Test Report: Docker_Linux_containerd_arm64 22021

714686ca7bbd77e34d847e892f53d4af2ede556f:2025-12-02:42609

Failed tests (25/321)

Order  Failed test  Duration (s)
171 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/StartWithProxy 507.66
173 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/SoftStart 369.03
175 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/KubectlGetPods 2.32
185 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/MinikubeKubectlCmd 2.27
186 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/MinikubeKubectlCmdDirectly 2.33
187 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/ExtraConfig 734.82
188 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/ComponentHealth 2.27
191 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/InvalidService 0.06
194 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/DashboardCmd 1.71
197 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/StatusCmd 3.1
201 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ServiceCmdConnect 2.49
203 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim 241.69
213 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/NodeLabels 1.42
219 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/TunnelCmd/serial/RunSecondTunnel 0.59
222 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/TunnelCmd/serial/WaitService/Setup 0.11
223 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/TunnelCmd/serial/AccessDirect 92.8
228 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ServiceCmd/DeployApp 0.06
229 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ServiceCmd/List 0.25
230 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ServiceCmd/JSONOutput 0.27
231 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ServiceCmd/HTTPS 0.29
232 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ServiceCmd/Format 0.26
233 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ServiceCmd/URL 0.26
237 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/MountCmd/any-port 2.37
358 TestKubernetesUpgrade 792.37
486 TestStartStop/group/no-preload/serial/AddonExistsAfterStop 7200.143
TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/StartWithProxy (507.66s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/StartWithProxy
functional_test.go:2239: (dbg) Run:  out/minikube-linux-arm64 start -p functional-449836 --memory=4096 --apiserver-port=8441 --wait=all --driver=docker  --container-runtime=containerd --kubernetes-version=v1.35.0-beta.0
E1202 19:00:46.066361    4435 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/addons-932514/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1202 19:01:13.770098    4435 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/addons-932514/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1202 19:03:00.706193    4435 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-224594/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1202 19:03:00.712567    4435 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-224594/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1202 19:03:00.723916    4435 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-224594/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1202 19:03:00.745309    4435 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-224594/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1202 19:03:00.786769    4435 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-224594/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1202 19:03:00.868248    4435 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-224594/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1202 19:03:01.029766    4435 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-224594/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1202 19:03:01.351597    4435 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-224594/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1202 19:03:01.993712    4435 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-224594/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1202 19:03:03.275962    4435 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-224594/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1202 19:03:05.837456    4435 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-224594/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1202 19:03:10.958960    4435 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-224594/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1202 19:03:21.200374    4435 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-224594/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1202 19:03:41.681835    4435 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-224594/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1202 19:04:22.645119    4435 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-224594/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1202 19:05:44.569750    4435 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-224594/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1202 19:05:46.065858    4435 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/addons-932514/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
functional_test.go:2239: (dbg) Non-zero exit: out/minikube-linux-arm64 start -p functional-449836 --memory=4096 --apiserver-port=8441 --wait=all --driver=docker  --container-runtime=containerd --kubernetes-version=v1.35.0-beta.0: exit status 109 (8m26.204073912s)

-- stdout --
	* [functional-449836] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	  - MINIKUBE_LOCATION=22021
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - KUBECONFIG=/home/jenkins/minikube-integration/22021-2487/kubeconfig
	  - MINIKUBE_HOME=/home/jenkins/minikube-integration/22021-2487/.minikube
	  - MINIKUBE_BIN=out/minikube-linux-arm64
	  - MINIKUBE_FORCE_SYSTEMD=
	* Using the docker driver based on user configuration
	* Using Docker driver with root privileges
	* Starting "functional-449836" primary control-plane node in "functional-449836" cluster
	* Pulling base image v0.0.48-1764169655-21974 ...
	* Found network options:
	  - HTTP_PROXY=localhost:41973
	* Please see https://minikube.sigs.k8s.io/docs/handbook/vpn_and_proxy/ for more details
	* Preparing Kubernetes v1.35.0-beta.0 on containerd 2.1.5 ...
	
	

-- /stdout --
** stderr ** 
	! Local proxy ignored: not passing HTTP_PROXY=localhost:41973 to docker env.
	! You appear to be using a proxy, but your NO_PROXY environment does not include the minikube IP (192.168.49.2).
	! initialization failed, will try again: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-beta.0
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Generating "apiserver-kubelet-client" certificate and key
	[certs] Generating "front-proxy-ca" certificate and key
	[certs] Generating "front-proxy-client" certificate and key
	[certs] Generating "etcd/ca" certificate and key
	[certs] Generating "etcd/server" certificate and key
	[certs] etcd/server serving cert is signed for DNS names [functional-449836 localhost] and IPs [192.168.49.2 127.0.0.1 ::1]
	[certs] Generating "etcd/peer" certificate and key
	[certs] etcd/peer serving cert is signed for DNS names [functional-449836 localhost] and IPs [192.168.49.2 127.0.0.1 ::1]
	[certs] Generating "etcd/healthcheck-client" certificate and key
	[certs] Generating "apiserver-etcd-client" certificate and key
	[certs] Generating "sa" key and public key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.000911274s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	
	To see the stack trace of this error execute with --v=5 or higher
	
	* 
	X Error starting cluster: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-beta.0
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.001152361s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	
	To see the stack trace of this error execute with --v=5 or higher
	
	* 
	╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	X Exiting due to K8S_KUBELET_NOT_RUNNING: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-beta.0
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.001152361s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	
	To see the stack trace of this error execute with --v=5 or higher
	
	* Suggestion: Check output of 'journalctl -xeu kubelet', try passing --extra-config=kubelet.cgroup-driver=systemd to minikube start
	* Related issue: https://github.com/kubernetes/minikube/issues/4172

** /stderr **
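The stderr block above ends with minikube's own remediation hint (`--extra-config=kubelet.cgroup-driver=systemd`, issue #4172). A hedged sketch of how that could be retried manually, reusing the exact flags from the failing invocation — whether the override actually resolves the kubelet health-check timeout on this cgroups-v1 host is not confirmed by this report:

```shell
# Retry the failing start with the cgroup-driver override suggested in the log.
# All other flags are copied verbatim from the failing test invocation.
out/minikube-linux-arm64 start -p functional-449836 \
  --memory=4096 --apiserver-port=8441 --wait=all \
  --driver=docker --container-runtime=containerd \
  --kubernetes-version=v1.35.0-beta.0 \
  --extra-config=kubelet.cgroup-driver=systemd

# If the kubelet still fails its health check, inspect it inside the node
# container, as the kubeadm output advises:
out/minikube-linux-arm64 -p functional-449836 ssh "sudo systemctl status kubelet"
out/minikube-linux-arm64 -p functional-449836 ssh "sudo journalctl -xeu kubelet"
```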
functional_test.go:2241: failed minikube start. args "out/minikube-linux-arm64 start -p functional-449836 --memory=4096 --apiserver-port=8441 --wait=all --driver=docker  --container-runtime=containerd --kubernetes-version=v1.35.0-beta.0": exit status 109
helpers_test.go:222: -----------------------post-mortem--------------------------------
helpers_test.go:223: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/StartWithProxy]: network settings <======
helpers_test.go:230: HOST ENV snapshots: PROXY env: HTTP_PROXY="<empty>" HTTPS_PROXY="<empty>" NO_PROXY="<empty>"
helpers_test.go:238: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/StartWithProxy]: docker inspect <======
helpers_test.go:239: (dbg) Run:  docker inspect functional-449836
helpers_test.go:243: (dbg) docker inspect functional-449836:

-- stdout --
	[
	    {
	        "Id": "6870f21b62bb6903aca3129f1ce4723cf3f2ffad99a50b164f8e2dc04b50e75d",
	        "Created": "2025-12-02T18:58:53.515075222Z",
	        "Path": "/usr/local/bin/entrypoint",
	        "Args": [
	            "/sbin/init"
	        ],
	        "State": {
	            "Status": "running",
	            "Running": true,
	            "Paused": false,
	            "Restarting": false,
	            "OOMKilled": false,
	            "Dead": false,
	            "Pid": 43093,
	            "ExitCode": 0,
	            "Error": "",
	            "StartedAt": "2025-12-02T18:58:53.587847975Z",
	            "FinishedAt": "0001-01-01T00:00:00Z"
	        },
	        "Image": "sha256:ac919894123858c63a6b115b7a0677e38aafc32ba4f00c3ebbd7c61e958451be",
	        "ResolvConfPath": "/var/lib/docker/containers/6870f21b62bb6903aca3129f1ce4723cf3f2ffad99a50b164f8e2dc04b50e75d/resolv.conf",
	        "HostnamePath": "/var/lib/docker/containers/6870f21b62bb6903aca3129f1ce4723cf3f2ffad99a50b164f8e2dc04b50e75d/hostname",
	        "HostsPath": "/var/lib/docker/containers/6870f21b62bb6903aca3129f1ce4723cf3f2ffad99a50b164f8e2dc04b50e75d/hosts",
	        "LogPath": "/var/lib/docker/containers/6870f21b62bb6903aca3129f1ce4723cf3f2ffad99a50b164f8e2dc04b50e75d/6870f21b62bb6903aca3129f1ce4723cf3f2ffad99a50b164f8e2dc04b50e75d-json.log",
	        "Name": "/functional-449836",
	        "RestartCount": 0,
	        "Driver": "overlay2",
	        "Platform": "linux",
	        "MountLabel": "",
	        "ProcessLabel": "",
	        "AppArmorProfile": "unconfined",
	        "ExecIDs": null,
	        "HostConfig": {
	            "Binds": [
	                "/lib/modules:/lib/modules:ro",
	                "functional-449836:/var"
	            ],
	            "ContainerIDFile": "",
	            "LogConfig": {
	                "Type": "json-file",
	                "Config": {}
	            },
	            "NetworkMode": "functional-449836",
	            "PortBindings": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "8441/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ]
	            },
	            "RestartPolicy": {
	                "Name": "no",
	                "MaximumRetryCount": 0
	            },
	            "AutoRemove": false,
	            "VolumeDriver": "",
	            "VolumesFrom": null,
	            "ConsoleSize": [
	                0,
	                0
	            ],
	            "CapAdd": null,
	            "CapDrop": null,
	            "CgroupnsMode": "host",
	            "Dns": [],
	            "DnsOptions": [],
	            "DnsSearch": [],
	            "ExtraHosts": null,
	            "GroupAdd": null,
	            "IpcMode": "private",
	            "Cgroup": "",
	            "Links": null,
	            "OomScoreAdj": 0,
	            "PidMode": "",
	            "Privileged": true,
	            "PublishAllPorts": false,
	            "ReadonlyRootfs": false,
	            "SecurityOpt": [
	                "seccomp=unconfined",
	                "apparmor=unconfined",
	                "label=disable"
	            ],
	            "Tmpfs": {
	                "/run": "",
	                "/tmp": ""
	            },
	            "UTSMode": "",
	            "UsernsMode": "",
	            "ShmSize": 67108864,
	            "Runtime": "runc",
	            "Isolation": "",
	            "CpuShares": 0,
	            "Memory": 4294967296,
	            "NanoCpus": 2000000000,
	            "CgroupParent": "",
	            "BlkioWeight": 0,
	            "BlkioWeightDevice": [],
	            "BlkioDeviceReadBps": [],
	            "BlkioDeviceWriteBps": [],
	            "BlkioDeviceReadIOps": [],
	            "BlkioDeviceWriteIOps": [],
	            "CpuPeriod": 0,
	            "CpuQuota": 0,
	            "CpuRealtimePeriod": 0,
	            "CpuRealtimeRuntime": 0,
	            "CpusetCpus": "",
	            "CpusetMems": "",
	            "Devices": [],
	            "DeviceCgroupRules": null,
	            "DeviceRequests": null,
	            "MemoryReservation": 0,
	            "MemorySwap": 8589934592,
	            "MemorySwappiness": null,
	            "OomKillDisable": false,
	            "PidsLimit": null,
	            "Ulimits": [],
	            "CpuCount": 0,
	            "CpuPercent": 0,
	            "IOMaximumIOps": 0,
	            "IOMaximumBandwidth": 0,
	            "MaskedPaths": null,
	            "ReadonlyPaths": null
	        },
	        "GraphDriver": {
	            "Data": {
	                "ID": "6870f21b62bb6903aca3129f1ce4723cf3f2ffad99a50b164f8e2dc04b50e75d",
	                "LowerDir": "/var/lib/docker/overlay2/41ea5a18fbf8709879a7fc4066a6a4a1474aa86e898ee1ccabe5669a1871131d-init/diff:/var/lib/docker/overlay2/a59c61675ee48e07a7f4a8725bd393449453344ad8907963779ea1c0059d936c/diff",
	                "MergedDir": "/var/lib/docker/overlay2/41ea5a18fbf8709879a7fc4066a6a4a1474aa86e898ee1ccabe5669a1871131d/merged",
	                "UpperDir": "/var/lib/docker/overlay2/41ea5a18fbf8709879a7fc4066a6a4a1474aa86e898ee1ccabe5669a1871131d/diff",
	                "WorkDir": "/var/lib/docker/overlay2/41ea5a18fbf8709879a7fc4066a6a4a1474aa86e898ee1ccabe5669a1871131d/work"
	            },
	            "Name": "overlay2"
	        },
	        "Mounts": [
	            {
	                "Type": "bind",
	                "Source": "/lib/modules",
	                "Destination": "/lib/modules",
	                "Mode": "ro",
	                "RW": false,
	                "Propagation": "rprivate"
	            },
	            {
	                "Type": "volume",
	                "Name": "functional-449836",
	                "Source": "/var/lib/docker/volumes/functional-449836/_data",
	                "Destination": "/var",
	                "Driver": "local",
	                "Mode": "z",
	                "RW": true,
	                "Propagation": ""
	            }
	        ],
	        "Config": {
	            "Hostname": "functional-449836",
	            "Domainname": "",
	            "User": "",
	            "AttachStdin": false,
	            "AttachStdout": false,
	            "AttachStderr": false,
	            "ExposedPorts": {
	                "22/tcp": {},
	                "2376/tcp": {},
	                "32443/tcp": {},
	                "5000/tcp": {},
	                "8441/tcp": {}
	            },
	            "Tty": true,
	            "OpenStdin": false,
	            "StdinOnce": false,
	            "Env": [
	                "container=docker",
	                "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin"
	            ],
	            "Cmd": null,
	            "Image": "gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b",
	            "Volumes": null,
	            "WorkingDir": "/",
	            "Entrypoint": [
	                "/usr/local/bin/entrypoint",
	                "/sbin/init"
	            ],
	            "OnBuild": null,
	            "Labels": {
	                "created_by.minikube.sigs.k8s.io": "true",
	                "mode.minikube.sigs.k8s.io": "functional-449836",
	                "name.minikube.sigs.k8s.io": "functional-449836",
	                "role.minikube.sigs.k8s.io": ""
	            },
	            "StopSignal": "SIGRTMIN+3"
	        },
	        "NetworkSettings": {
	            "Bridge": "",
	            "SandboxID": "410fbaab809a56b556195f7ed6eeff8dcd31e9020fb1dbfacf74828b79df3d88",
	            "SandboxKey": "/var/run/docker/netns/410fbaab809a",
	            "Ports": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "32788"
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "32789"
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "32792"
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "32790"
	                    }
	                ],
	                "8441/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "32791"
	                    }
	                ]
	            },
	            "HairpinMode": false,
	            "LinkLocalIPv6Address": "",
	            "LinkLocalIPv6PrefixLen": 0,
	            "SecondaryIPAddresses": null,
	            "SecondaryIPv6Addresses": null,
	            "EndpointID": "",
	            "Gateway": "",
	            "GlobalIPv6Address": "",
	            "GlobalIPv6PrefixLen": 0,
	            "IPAddress": "",
	            "IPPrefixLen": 0,
	            "IPv6Gateway": "",
	            "MacAddress": "",
	            "Networks": {
	                "functional-449836": {
	                    "IPAMConfig": {
	                        "IPv4Address": "192.168.49.2"
	                    },
	                    "Links": null,
	                    "Aliases": null,
	                    "MacAddress": "82:18:11:ce:46:48",
	                    "DriverOpts": null,
	                    "GwPriority": 0,
	                    "NetworkID": "3cb7d67d4fa267ddd6b37211325b224fb3fb811be8ff57bda18e19f6929ec9c8",
	                    "EndpointID": "20c8c1a67e53d7615656777f73986a40cb1c6affb22c4db185c479ac85cbdb14",
	                    "Gateway": "192.168.49.1",
	                    "IPAddress": "192.168.49.2",
	                    "IPPrefixLen": 24,
	                    "IPv6Gateway": "",
	                    "GlobalIPv6Address": "",
	                    "GlobalIPv6PrefixLen": 0,
	                    "DNSNames": [
	                        "functional-449836",
	                        "6870f21b62bb"
	                    ]
	                }
	            }
	        }
	    }
	]

-- /stdout --
helpers_test.go:247: (dbg) Run:  out/minikube-linux-arm64 status --format={{.Host}} -p functional-449836 -n functional-449836
helpers_test.go:247: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.Host}} -p functional-449836 -n functional-449836: exit status 6 (338.030274ms)

-- stdout --
	Running
	WARNING: Your kubectl is pointing to stale minikube-vm.
	To fix the kubectl context, run `minikube update-context`

-- /stdout --
** stderr ** 
	E1202 19:07:18.779739   48797 status.go:458] kubeconfig endpoint: get endpoint: "functional-449836" does not appear in /home/jenkins/minikube-integration/22021-2487/kubeconfig

** /stderr **
helpers_test.go:247: status error: exit status 6 (may be ok)
helpers_test.go:252: <<< TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/StartWithProxy FAILED: start of post-mortem logs <<<
helpers_test.go:253: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/StartWithProxy]: minikube logs <======
helpers_test.go:255: (dbg) Run:  out/minikube-linux-arm64 -p functional-449836 logs -n 25
helpers_test.go:260: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/StartWithProxy logs: 
-- stdout --
	
	==> Audit <==
	┌────────────────┬─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┬───────────────────┬─────────┬─────────┬─────────────────────┬─────────────────────┐
	│    COMMAND     │                                                                              ARGS                                                                               │      PROFILE      │  USER   │ VERSION │     START TIME      │      END TIME       │
	├────────────────┼─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┼───────────────────┼─────────┼─────────┼─────────────────────┼─────────────────────┤
	│ ssh            │ functional-224594 ssh sudo cat /etc/ssl/certs/51391683.0                                                                                                        │ functional-224594 │ jenkins │ v1.37.0 │ 02 Dec 25 18:58 UTC │ 02 Dec 25 18:58 UTC │
	│ ssh            │ functional-224594 ssh sudo cat /etc/ssl/certs/44352.pem                                                                                                         │ functional-224594 │ jenkins │ v1.37.0 │ 02 Dec 25 18:58 UTC │ 02 Dec 25 18:58 UTC │
	│ image          │ functional-224594 image load --daemon kicbase/echo-server:functional-224594 --alsologtostderr                                                                   │ functional-224594 │ jenkins │ v1.37.0 │ 02 Dec 25 18:58 UTC │ 02 Dec 25 18:58 UTC │
	│ ssh            │ functional-224594 ssh sudo cat /usr/share/ca-certificates/44352.pem                                                                                             │ functional-224594 │ jenkins │ v1.37.0 │ 02 Dec 25 18:58 UTC │ 02 Dec 25 18:58 UTC │
	│ ssh            │ functional-224594 ssh sudo cat /etc/ssl/certs/3ec20f2e.0                                                                                                        │ functional-224594 │ jenkins │ v1.37.0 │ 02 Dec 25 18:58 UTC │ 02 Dec 25 18:58 UTC │
	│ image          │ functional-224594 image ls                                                                                                                                      │ functional-224594 │ jenkins │ v1.37.0 │ 02 Dec 25 18:58 UTC │ 02 Dec 25 18:58 UTC │
	│ ssh            │ functional-224594 ssh sudo cat /etc/test/nested/copy/4435/hosts                                                                                                 │ functional-224594 │ jenkins │ v1.37.0 │ 02 Dec 25 18:58 UTC │ 02 Dec 25 18:58 UTC │
	│ image          │ functional-224594 image save kicbase/echo-server:functional-224594 /home/jenkins/workspace/Docker_Linux_containerd_arm64/echo-server-save.tar --alsologtostderr │ functional-224594 │ jenkins │ v1.37.0 │ 02 Dec 25 18:58 UTC │ 02 Dec 25 18:58 UTC │
	│ image          │ functional-224594 image rm kicbase/echo-server:functional-224594 --alsologtostderr                                                                              │ functional-224594 │ jenkins │ v1.37.0 │ 02 Dec 25 18:58 UTC │ 02 Dec 25 18:58 UTC │
	│ image          │ functional-224594 image ls                                                                                                                                      │ functional-224594 │ jenkins │ v1.37.0 │ 02 Dec 25 18:58 UTC │ 02 Dec 25 18:58 UTC │
	│ image          │ functional-224594 image load /home/jenkins/workspace/Docker_Linux_containerd_arm64/echo-server-save.tar --alsologtostderr                                       │ functional-224594 │ jenkins │ v1.37.0 │ 02 Dec 25 18:58 UTC │ 02 Dec 25 18:58 UTC │
	│ image          │ functional-224594 image ls                                                                                                                                      │ functional-224594 │ jenkins │ v1.37.0 │ 02 Dec 25 18:58 UTC │ 02 Dec 25 18:58 UTC │
	│ update-context │ functional-224594 update-context --alsologtostderr -v=2                                                                                                         │ functional-224594 │ jenkins │ v1.37.0 │ 02 Dec 25 18:58 UTC │ 02 Dec 25 18:58 UTC │
	│ image          │ functional-224594 image save --daemon kicbase/echo-server:functional-224594 --alsologtostderr                                                                   │ functional-224594 │ jenkins │ v1.37.0 │ 02 Dec 25 18:58 UTC │ 02 Dec 25 18:58 UTC │
	│ update-context │ functional-224594 update-context --alsologtostderr -v=2                                                                                                         │ functional-224594 │ jenkins │ v1.37.0 │ 02 Dec 25 18:58 UTC │ 02 Dec 25 18:58 UTC │
	│ update-context │ functional-224594 update-context --alsologtostderr -v=2                                                                                                         │ functional-224594 │ jenkins │ v1.37.0 │ 02 Dec 25 18:58 UTC │ 02 Dec 25 18:58 UTC │
	│ image          │ functional-224594 image ls --format short --alsologtostderr                                                                                                     │ functional-224594 │ jenkins │ v1.37.0 │ 02 Dec 25 18:58 UTC │ 02 Dec 25 18:58 UTC │
	│ image          │ functional-224594 image ls --format yaml --alsologtostderr                                                                                                      │ functional-224594 │ jenkins │ v1.37.0 │ 02 Dec 25 18:58 UTC │ 02 Dec 25 18:58 UTC │
	│ ssh            │ functional-224594 ssh pgrep buildkitd                                                                                                                           │ functional-224594 │ jenkins │ v1.37.0 │ 02 Dec 25 18:58 UTC │                     │
	│ image          │ functional-224594 image ls --format json --alsologtostderr                                                                                                      │ functional-224594 │ jenkins │ v1.37.0 │ 02 Dec 25 18:58 UTC │ 02 Dec 25 18:58 UTC │
	│ image          │ functional-224594 image ls --format table --alsologtostderr                                                                                                     │ functional-224594 │ jenkins │ v1.37.0 │ 02 Dec 25 18:58 UTC │ 02 Dec 25 18:58 UTC │
	│ image          │ functional-224594 image build -t localhost/my-image:functional-224594 testdata/build --alsologtostderr                                                          │ functional-224594 │ jenkins │ v1.37.0 │ 02 Dec 25 18:58 UTC │ 02 Dec 25 18:58 UTC │
	│ image          │ functional-224594 image ls                                                                                                                                      │ functional-224594 │ jenkins │ v1.37.0 │ 02 Dec 25 18:58 UTC │ 02 Dec 25 18:58 UTC │
	│ delete         │ -p functional-224594                                                                                                                                            │ functional-224594 │ jenkins │ v1.37.0 │ 02 Dec 25 18:58 UTC │ 02 Dec 25 18:58 UTC │
	│ start          │ -p functional-449836 --memory=4096 --apiserver-port=8441 --wait=all --driver=docker  --container-runtime=containerd --kubernetes-version=v1.35.0-beta.0         │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 18:58 UTC │                     │
	└────────────────┴─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┴───────────────────┴─────────┴─────────┴─────────────────────┴─────────────────────┘
	
	
	==> Last Start <==
	Log file created at: 2025/12/02 18:58:52
	Running on machine: ip-172-31-31-251
	Binary: Built with gc go1.25.3 for linux/arm64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I1202 18:58:52.273912   42791 out.go:360] Setting OutFile to fd 1 ...
	I1202 18:58:52.274090   42791 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1202 18:58:52.274094   42791 out.go:374] Setting ErrFile to fd 2...
	I1202 18:58:52.274098   42791 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1202 18:58:52.274358   42791 root.go:338] Updating PATH: /home/jenkins/minikube-integration/22021-2487/.minikube/bin
	I1202 18:58:52.274761   42791 out.go:368] Setting JSON to false
	I1202 18:58:52.275597   42791 start.go:133] hostinfo: {"hostname":"ip-172-31-31-251","uptime":2469,"bootTime":1764699464,"procs":154,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1084-aws","kernelArch":"aarch64","virtualizationSystem":"","virtualizationRole":"","hostId":"982e3628-3742-4b3e-bb63-ac1b07660ec7"}
	I1202 18:58:52.275653   42791 start.go:143] virtualization:  
	I1202 18:58:52.282673   42791 out.go:179] * [functional-449836] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	I1202 18:58:52.286341   42791 out.go:179]   - MINIKUBE_LOCATION=22021
	I1202 18:58:52.286437   42791 notify.go:221] Checking for updates...
	I1202 18:58:52.293348   42791 out.go:179]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I1202 18:58:52.296563   42791 out.go:179]   - KUBECONFIG=/home/jenkins/minikube-integration/22021-2487/kubeconfig
	I1202 18:58:52.299657   42791 out.go:179]   - MINIKUBE_HOME=/home/jenkins/minikube-integration/22021-2487/.minikube
	I1202 18:58:52.302656   42791 out.go:179]   - MINIKUBE_BIN=out/minikube-linux-arm64
	I1202 18:58:52.305714   42791 out.go:179]   - MINIKUBE_FORCE_SYSTEMD=
	I1202 18:58:52.309055   42791 driver.go:422] Setting default libvirt URI to qemu:///system
	I1202 18:58:52.330835   42791 docker.go:124] docker version: linux-28.1.1:Docker Engine - Community
	I1202 18:58:52.330951   42791 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1202 18:58:52.400766   42791 info.go:266] docker info: {ID:EOU5:DNGX:XN6V:L2FZ:UXRM:5TWK:EVUR:KC2F:GT7Z:Y4O4:GB77:5PD3 Containers:0 ContainersRunning:0 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:23 OomKillDisable:true NGoroutines:43 SystemTime:2025-12-02 18:58:52.390664424 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214839296 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-31-251 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1202 18:58:52.400856   42791 docker.go:319] overlay module found
	I1202 18:58:52.406278   42791 out.go:179] * Using the docker driver based on user configuration
	I1202 18:58:52.409219   42791 start.go:309] selected driver: docker
	I1202 18:58:52.409230   42791 start.go:927] validating driver "docker" against <nil>
	I1202 18:58:52.409241   42791 start.go:938] status for docker: {Installed:true Healthy:true Running:false NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I1202 18:58:52.409987   42791 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1202 18:58:52.462910   42791 info.go:266] docker info: {ID:EOU5:DNGX:XN6V:L2FZ:UXRM:5TWK:EVUR:KC2F:GT7Z:Y4O4:GB77:5PD3 Containers:0 ContainersRunning:0 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:23 OomKillDisable:true NGoroutines:43 SystemTime:2025-12-02 18:58:52.454023149 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214839296 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-31-251 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1202 18:58:52.463067   42791 start_flags.go:327] no existing cluster config was found, will generate one from the flags 
	I1202 18:58:52.463284   42791 start_flags.go:992] Waiting for all components: map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true]
	I1202 18:58:52.466422   42791 out.go:179] * Using Docker driver with root privileges
	I1202 18:58:52.469262   42791 cni.go:84] Creating CNI manager for ""
	I1202 18:58:52.469324   42791 cni.go:143] "docker" driver + "containerd" runtime found, recommending kindnet
	I1202 18:58:52.469331   42791 start_flags.go:336] Found "CNI" CNI - setting NetworkPlugin=cni
	I1202 18:58:52.469405   42791 start.go:353] cluster config:
	{Name:functional-449836 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-449836 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP: Port:8441 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1202 18:58:52.472637   42791 out.go:179] * Starting "functional-449836" primary control-plane node in "functional-449836" cluster
	I1202 18:58:52.475539   42791 cache.go:134] Beginning downloading kic base image for docker with containerd
	I1202 18:58:52.478433   42791 out.go:179] * Pulling base image v0.0.48-1764169655-21974 ...
	I1202 18:58:52.481328   42791 image.go:81] Checking for gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b in local docker daemon
	I1202 18:58:52.481478   42791 preload.go:188] Checking if preload exists for k8s version v1.35.0-beta.0 and runtime containerd
	I1202 18:58:52.501368   42791 image.go:100] Found gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b in local docker daemon, skipping pull
	I1202 18:58:52.501379   42791 cache.go:158] gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b exists in daemon, skipping load
	W1202 18:58:52.546972   42791 preload.go:144] https://storage.googleapis.com/minikube-preloaded-volume-tarballs/v18/v1.35.0-beta.0/preloaded-images-k8s-v18-v1.35.0-beta.0-containerd-overlay2-arm64.tar.lz4 status code: 404
	W1202 18:58:52.726074   42791 preload.go:144] https://github.com/kubernetes-sigs/minikube-preloads/releases/download/v18/preloaded-images-k8s-v18-v1.35.0-beta.0-containerd-overlay2-arm64.tar.lz4 status code: 404
	I1202 18:58:52.726240   42791 cache.go:107] acquiring lock: {Name:mkb3ffc95e4b7ac3756206049d851bf516a8abb7 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1202 18:58:52.726338   42791 cache.go:115] /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/gcr.io/k8s-minikube/storage-provisioner_v5 exists
	I1202 18:58:52.726347   42791 cache.go:96] cache image "gcr.io/k8s-minikube/storage-provisioner:v5" -> "/home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/gcr.io/k8s-minikube/storage-provisioner_v5" took 120.41µs
	I1202 18:58:52.726360   42791 cache.go:80] save to tar file gcr.io/k8s-minikube/storage-provisioner:v5 -> /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/gcr.io/k8s-minikube/storage-provisioner_v5 succeeded
	I1202 18:58:52.726370   42791 cache.go:107] acquiring lock: {Name:mkfa39bba55c97fa80e441f8dcbaf6dc6a2ab6fc Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1202 18:58:52.726398   42791 cache.go:115] /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/kube-apiserver_v1.35.0-beta.0 exists
	I1202 18:58:52.726409   42791 cache.go:96] cache image "registry.k8s.io/kube-apiserver:v1.35.0-beta.0" -> "/home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/kube-apiserver_v1.35.0-beta.0" took 33.551µs
	I1202 18:58:52.726415   42791 cache.go:80] save to tar file registry.k8s.io/kube-apiserver:v1.35.0-beta.0 -> /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/kube-apiserver_v1.35.0-beta.0 succeeded
	I1202 18:58:52.726423   42791 cache.go:107] acquiring lock: {Name:mk7e3720bc30e96a70479f1acc707ef52791d566 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1202 18:58:52.726433   42791 profile.go:143] Saving config to /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/config.json ...
	I1202 18:58:52.726451   42791 cache.go:115] /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/kube-controller-manager_v1.35.0-beta.0 exists
	I1202 18:58:52.726455   42791 cache.go:96] cache image "registry.k8s.io/kube-controller-manager:v1.35.0-beta.0" -> "/home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/kube-controller-manager_v1.35.0-beta.0" took 33.124µs
	I1202 18:58:52.726460   42791 cache.go:80] save to tar file registry.k8s.io/kube-controller-manager:v1.35.0-beta.0 -> /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/kube-controller-manager_v1.35.0-beta.0 succeeded
	I1202 18:58:52.726456   42791 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/config.json: {Name:mk64bea15d4652689d28dddc7b023cf0d077a8b9 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1202 18:58:52.726469   42791 cache.go:107] acquiring lock: {Name:mk87fcb81abcb9216a37cb770c1db1797c0a7f91 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1202 18:58:52.726547   42791 cache.go:115] /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/kube-scheduler_v1.35.0-beta.0 exists
	I1202 18:58:52.726551   42791 cache.go:96] cache image "registry.k8s.io/kube-scheduler:v1.35.0-beta.0" -> "/home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/kube-scheduler_v1.35.0-beta.0" took 83.118µs
	I1202 18:58:52.726556   42791 cache.go:80] save to tar file registry.k8s.io/kube-scheduler:v1.35.0-beta.0 -> /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/kube-scheduler_v1.35.0-beta.0 succeeded
	I1202 18:58:52.726563   42791 cache.go:107] acquiring lock: {Name:mkb0da8840651a370490ea2b46213e13fc0d5dac Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1202 18:58:52.726588   42791 cache.go:115] /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/kube-proxy_v1.35.0-beta.0 exists
	I1202 18:58:52.726592   42791 cache.go:96] cache image "registry.k8s.io/kube-proxy:v1.35.0-beta.0" -> "/home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/kube-proxy_v1.35.0-beta.0" took 29.383µs
	I1202 18:58:52.726596   42791 cache.go:80] save to tar file registry.k8s.io/kube-proxy:v1.35.0-beta.0 -> /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/kube-proxy_v1.35.0-beta.0 succeeded
	I1202 18:58:52.726604   42791 cache.go:107] acquiring lock: {Name:mk9eec99a3e8e54b076a2ce506d08ceb8a7f49cb Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1202 18:58:52.726627   42791 cache.go:115] /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/pause_3.10.1 exists
	I1202 18:58:52.726631   42791 cache.go:96] cache image "registry.k8s.io/pause:3.10.1" -> "/home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/pause_3.10.1" took 28.291µs
	I1202 18:58:52.726632   42791 cache.go:243] Successfully downloaded all kic artifacts
	I1202 18:58:52.726635   42791 cache.go:80] save to tar file registry.k8s.io/pause:3.10.1 -> /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/pause_3.10.1 succeeded
	I1202 18:58:52.726643   42791 cache.go:107] acquiring lock: {Name:mk280b51a6d3bfe0cb60ae7355309f1bf1f99e1d Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1202 18:58:52.726648   42791 start.go:360] acquireMachinesLock for functional-449836: {Name:mk8999fdfa518fc15358d07431fe9bec286a035e Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1202 18:58:52.726667   42791 cache.go:115] /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/etcd_3.6.5-0 exists
	I1202 18:58:52.726671   42791 cache.go:96] cache image "registry.k8s.io/etcd:3.6.5-0" -> "/home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/etcd_3.6.5-0" took 29.112µs
	I1202 18:58:52.726675   42791 cache.go:80] save to tar file registry.k8s.io/etcd:3.6.5-0 -> /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/etcd_3.6.5-0 succeeded
	I1202 18:58:52.726682   42791 cache.go:107] acquiring lock: {Name:mkbf2c8ea9fae755e8e7ae1c483527f313757bae Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1202 18:58:52.726689   42791 start.go:364] duration metric: took 33.485µs to acquireMachinesLock for "functional-449836"
	I1202 18:58:52.726706   42791 cache.go:115] /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/coredns/coredns_v1.13.1 exists
	I1202 18:58:52.726709   42791 cache.go:96] cache image "registry.k8s.io/coredns/coredns:v1.13.1" -> "/home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/coredns/coredns_v1.13.1" took 28.234µs
	I1202 18:58:52.726713   42791 cache.go:80] save to tar file registry.k8s.io/coredns/coredns:v1.13.1 -> /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/coredns/coredns_v1.13.1 succeeded
	I1202 18:58:52.726720   42791 cache.go:87] Successfully saved all images to host disk.
	I1202 18:58:52.726705   42791 start.go:93] Provisioning new machine with config: &{Name:functional-449836 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-449836 Namespace:default APIServerHAVIP: AP
IServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP: Port:8441 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNS
Log:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} &{Name: IP: Port:8441 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}
	I1202 18:58:52.726761   42791 start.go:125] createHost starting for "" (driver="docker")
	I1202 18:58:52.731715   42791 out.go:252] * Creating docker container (CPUs=2, Memory=4096MB) ...
	W1202 18:58:52.731973   42791 out.go:285] ! Local proxy ignored: not passing HTTP_PROXY=localhost:41973 to docker env.
	I1202 18:58:52.732040   42791 start.go:159] libmachine.API.Create for "functional-449836" (driver="docker")
	I1202 18:58:52.732062   42791 client.go:173] LocalClient.Create starting
	I1202 18:58:52.732160   42791 main.go:143] libmachine: Reading certificate data from /home/jenkins/minikube-integration/22021-2487/.minikube/certs/ca.pem
	I1202 18:58:52.732191   42791 main.go:143] libmachine: Decoding PEM data...
	I1202 18:58:52.732211   42791 main.go:143] libmachine: Parsing certificate...
	I1202 18:58:52.732255   42791 main.go:143] libmachine: Reading certificate data from /home/jenkins/minikube-integration/22021-2487/.minikube/certs/cert.pem
	I1202 18:58:52.732271   42791 main.go:143] libmachine: Decoding PEM data...
	I1202 18:58:52.732281   42791 main.go:143] libmachine: Parsing certificate...
	I1202 18:58:52.732658   42791 cli_runner.go:164] Run: docker network inspect functional-449836 --format "{"Name": "{{.Name}}","Driver": "{{.Driver}}","Subnet": "{{range .IPAM.Config}}{{.Subnet}}{{end}}","Gateway": "{{range .IPAM.Config}}{{.Gateway}}{{end}}","MTU": {{if (index .Options "com.docker.network.driver.mtu")}}{{(index .Options "com.docker.network.driver.mtu")}}{{else}}0{{end}}, "ContainerIPs": [{{range $k,$v := .Containers }}"{{$v.IPv4Address}}",{{end}}]}"
	W1202 18:58:52.748895   42791 cli_runner.go:211] docker network inspect functional-449836 --format "{"Name": "{{.Name}}","Driver": "{{.Driver}}","Subnet": "{{range .IPAM.Config}}{{.Subnet}}{{end}}","Gateway": "{{range .IPAM.Config}}{{.Gateway}}{{end}}","MTU": {{if (index .Options "com.docker.network.driver.mtu")}}{{(index .Options "com.docker.network.driver.mtu")}}{{else}}0{{end}}, "ContainerIPs": [{{range $k,$v := .Containers }}"{{$v.IPv4Address}}",{{end}}]}" returned with exit code 1
	I1202 18:58:52.748961   42791 network_create.go:284] running [docker network inspect functional-449836] to gather additional debugging logs...
	I1202 18:58:52.748976   42791 cli_runner.go:164] Run: docker network inspect functional-449836
	W1202 18:58:52.766301   42791 cli_runner.go:211] docker network inspect functional-449836 returned with exit code 1
	I1202 18:58:52.766319   42791 network_create.go:287] error running [docker network inspect functional-449836]: docker network inspect functional-449836: exit status 1
	stdout:
	[]
	
	stderr:
	Error response from daemon: network functional-449836 not found
	I1202 18:58:52.766330   42791 network_create.go:289] output of [docker network inspect functional-449836]: -- stdout --
	[]
	
	-- /stdout --
	** stderr ** 
	Error response from daemon: network functional-449836 not found
	
	** /stderr **
	I1202 18:58:52.766419   42791 cli_runner.go:164] Run: docker network inspect bridge --format "{"Name": "{{.Name}}","Driver": "{{.Driver}}","Subnet": "{{range .IPAM.Config}}{{.Subnet}}{{end}}","Gateway": "{{range .IPAM.Config}}{{.Gateway}}{{end}}","MTU": {{if (index .Options "com.docker.network.driver.mtu")}}{{(index .Options "com.docker.network.driver.mtu")}}{{else}}0{{end}}, "ContainerIPs": [{{range $k,$v := .Containers }}"{{$v.IPv4Address}}",{{end}}]}"
	I1202 18:58:52.783516   42791 network.go:206] using free private subnet 192.168.49.0/24: &{IP:192.168.49.0 Netmask:255.255.255.0 Prefix:24 CIDR:192.168.49.0/24 Gateway:192.168.49.1 ClientMin:192.168.49.2 ClientMax:192.168.49.254 Broadcast:192.168.49.255 IsPrivate:true Interface:{IfaceName: IfaceIPv4: IfaceMTU:0 IfaceMAC:} reservation:0x40019f3110}
	I1202 18:58:52.783549   42791 network_create.go:124] attempt to create docker network functional-449836 192.168.49.0/24 with gateway 192.168.49.1 and MTU of 1500 ...
	I1202 18:58:52.783610   42791 cli_runner.go:164] Run: docker network create --driver=bridge --subnet=192.168.49.0/24 --gateway=192.168.49.1 -o --ip-masq -o --icc -o com.docker.network.driver.mtu=1500 --label=created_by.minikube.sigs.k8s.io=true --label=name.minikube.sigs.k8s.io=functional-449836 functional-449836
	I1202 18:58:52.840600   42791 network_create.go:108] docker network functional-449836 192.168.49.0/24 created
	I1202 18:58:52.840623   42791 kic.go:121] calculated static IP "192.168.49.2" for the "functional-449836" container
	I1202 18:58:52.840716   42791 cli_runner.go:164] Run: docker ps -a --format {{.Names}}
	I1202 18:58:52.856411   42791 cli_runner.go:164] Run: docker volume create functional-449836 --label name.minikube.sigs.k8s.io=functional-449836 --label created_by.minikube.sigs.k8s.io=true
	I1202 18:58:52.874845   42791 oci.go:103] Successfully created a docker volume functional-449836
	I1202 18:58:52.874916   42791 cli_runner.go:164] Run: docker run --rm --name functional-449836-preload-sidecar --label created_by.minikube.sigs.k8s.io=true --label name.minikube.sigs.k8s.io=functional-449836 --entrypoint /usr/bin/test -v functional-449836:/var gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b -d /var/lib
	I1202 18:58:53.430102   42791 oci.go:107] Successfully prepared a docker volume functional-449836
	I1202 18:58:53.430167   42791 preload.go:188] Checking if preload exists for k8s version v1.35.0-beta.0 and runtime containerd
	W1202 18:58:53.430314   42791 cgroups_linux.go:77] Your kernel does not support swap limit capabilities or the cgroup is not mounted.
	I1202 18:58:53.430424   42791 cli_runner.go:164] Run: docker info --format "'{{json .SecurityOptions}}'"
	I1202 18:58:53.500097   42791 cli_runner.go:164] Run: docker run -d -t --privileged --security-opt seccomp=unconfined --tmpfs /tmp --tmpfs /run -v /lib/modules:/lib/modules:ro --hostname functional-449836 --name functional-449836 --label created_by.minikube.sigs.k8s.io=true --label name.minikube.sigs.k8s.io=functional-449836 --label role.minikube.sigs.k8s.io= --label mode.minikube.sigs.k8s.io=functional-449836 --network functional-449836 --ip 192.168.49.2 --volume functional-449836:/var --security-opt apparmor=unconfined --memory=4096mb --cpus=2 -e container=docker --expose 8441 --publish=127.0.0.1::8441 --publish=127.0.0.1::22 --publish=127.0.0.1::2376 --publish=127.0.0.1::5000 --publish=127.0.0.1::32443 gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b
	I1202 18:58:53.804471   42791 cli_runner.go:164] Run: docker container inspect functional-449836 --format={{.State.Running}}
	I1202 18:58:53.828082   42791 cli_runner.go:164] Run: docker container inspect functional-449836 --format={{.State.Status}}
	I1202 18:58:53.853234   42791 cli_runner.go:164] Run: docker exec functional-449836 stat /var/lib/dpkg/alternatives/iptables
	I1202 18:58:53.905695   42791 oci.go:144] the created container "functional-449836" has a running status.
	I1202 18:58:53.905717   42791 kic.go:225] Creating ssh key for kic: /home/jenkins/minikube-integration/22021-2487/.minikube/machines/functional-449836/id_rsa...
	I1202 18:58:54.847185   42791 kic_runner.go:191] docker (temp): /home/jenkins/minikube-integration/22021-2487/.minikube/machines/functional-449836/id_rsa.pub --> /home/docker/.ssh/authorized_keys (381 bytes)
	I1202 18:58:54.867784   42791 cli_runner.go:164] Run: docker container inspect functional-449836 --format={{.State.Status}}
	I1202 18:58:54.885541   42791 kic_runner.go:93] Run: chown docker:docker /home/docker/.ssh/authorized_keys
	I1202 18:58:54.885552   42791 kic_runner.go:114] Args: [docker exec --privileged functional-449836 chown docker:docker /home/docker/.ssh/authorized_keys]
	I1202 18:58:54.927203   42791 cli_runner.go:164] Run: docker container inspect functional-449836 --format={{.State.Status}}
	I1202 18:58:54.945323   42791 machine.go:94] provisionDockerMachine start ...
	I1202 18:58:54.945432   42791 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-449836
	I1202 18:58:54.963475   42791 main.go:143] libmachine: Using SSH client type: native
	I1202 18:58:54.963807   42791 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3dad70] 0x3dd270 <nil>  [] 0s} 127.0.0.1 32788 <nil> <nil>}
	I1202 18:58:54.963813   42791 main.go:143] libmachine: About to run SSH command:
	hostname
	I1202 18:58:54.964491   42791 main.go:143] libmachine: Error dialing TCP: ssh: handshake failed: EOF
	I1202 18:58:58.116148   42791 main.go:143] libmachine: SSH cmd err, output: <nil>: functional-449836
	
	I1202 18:58:58.116162   42791 ubuntu.go:182] provisioning hostname "functional-449836"
	I1202 18:58:58.116225   42791 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-449836
	I1202 18:58:58.134154   42791 main.go:143] libmachine: Using SSH client type: native
	I1202 18:58:58.134451   42791 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3dad70] 0x3dd270 <nil>  [] 0s} 127.0.0.1 32788 <nil> <nil>}
	I1202 18:58:58.134459   42791 main.go:143] libmachine: About to run SSH command:
	sudo hostname functional-449836 && echo "functional-449836" | sudo tee /etc/hostname
	I1202 18:58:58.289274   42791 main.go:143] libmachine: SSH cmd err, output: <nil>: functional-449836
	
	I1202 18:58:58.289343   42791 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-449836
	I1202 18:58:58.307705   42791 main.go:143] libmachine: Using SSH client type: native
	I1202 18:58:58.308012   42791 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3dad70] 0x3dd270 <nil>  [] 0s} 127.0.0.1 32788 <nil> <nil>}
	I1202 18:58:58.308025   42791 main.go:143] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sfunctional-449836' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 functional-449836/g' /etc/hosts;
				else 
					echo '127.0.1.1 functional-449836' | sudo tee -a /etc/hosts; 
				fi
			fi
	I1202 18:58:58.456411   42791 main.go:143] libmachine: SSH cmd err, output: <nil>: 
	I1202 18:58:58.456428   42791 ubuntu.go:188] set auth options {CertDir:/home/jenkins/minikube-integration/22021-2487/.minikube CaCertPath:/home/jenkins/minikube-integration/22021-2487/.minikube/certs/ca.pem CaPrivateKeyPath:/home/jenkins/minikube-integration/22021-2487/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/home/jenkins/minikube-integration/22021-2487/.minikube/machines/server.pem ServerKeyPath:/home/jenkins/minikube-integration/22021-2487/.minikube/machines/server-key.pem ClientKeyPath:/home/jenkins/minikube-integration/22021-2487/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/home/jenkins/minikube-integration/22021-2487/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/home/jenkins/minikube-integration/22021-2487/.minikube}
	I1202 18:58:58.456455   42791 ubuntu.go:190] setting up certificates
	I1202 18:58:58.456463   42791 provision.go:84] configureAuth start
	I1202 18:58:58.456536   42791 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" functional-449836
	I1202 18:58:58.474527   42791 provision.go:143] copyHostCerts
	I1202 18:58:58.474586   42791 exec_runner.go:144] found /home/jenkins/minikube-integration/22021-2487/.minikube/ca.pem, removing ...
	I1202 18:58:58.474598   42791 exec_runner.go:203] rm: /home/jenkins/minikube-integration/22021-2487/.minikube/ca.pem
	I1202 18:58:58.474676   42791 exec_runner.go:151] cp: /home/jenkins/minikube-integration/22021-2487/.minikube/certs/ca.pem --> /home/jenkins/minikube-integration/22021-2487/.minikube/ca.pem (1082 bytes)
	I1202 18:58:58.474771   42791 exec_runner.go:144] found /home/jenkins/minikube-integration/22021-2487/.minikube/cert.pem, removing ...
	I1202 18:58:58.474775   42791 exec_runner.go:203] rm: /home/jenkins/minikube-integration/22021-2487/.minikube/cert.pem
	I1202 18:58:58.474798   42791 exec_runner.go:151] cp: /home/jenkins/minikube-integration/22021-2487/.minikube/certs/cert.pem --> /home/jenkins/minikube-integration/22021-2487/.minikube/cert.pem (1123 bytes)
	I1202 18:58:58.474872   42791 exec_runner.go:144] found /home/jenkins/minikube-integration/22021-2487/.minikube/key.pem, removing ...
	I1202 18:58:58.474876   42791 exec_runner.go:203] rm: /home/jenkins/minikube-integration/22021-2487/.minikube/key.pem
	I1202 18:58:58.474897   42791 exec_runner.go:151] cp: /home/jenkins/minikube-integration/22021-2487/.minikube/certs/key.pem --> /home/jenkins/minikube-integration/22021-2487/.minikube/key.pem (1675 bytes)
	I1202 18:58:58.474940   42791 provision.go:117] generating server cert: /home/jenkins/minikube-integration/22021-2487/.minikube/machines/server.pem ca-key=/home/jenkins/minikube-integration/22021-2487/.minikube/certs/ca.pem private-key=/home/jenkins/minikube-integration/22021-2487/.minikube/certs/ca-key.pem org=jenkins.functional-449836 san=[127.0.0.1 192.168.49.2 functional-449836 localhost minikube]
	I1202 18:58:58.650878   42791 provision.go:177] copyRemoteCerts
	I1202 18:58:58.650932   42791 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I1202 18:58:58.650970   42791 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-449836
	I1202 18:58:58.673717   42791 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:32788 SSHKeyPath:/home/jenkins/minikube-integration/22021-2487/.minikube/machines/functional-449836/id_rsa Username:docker}
	I1202 18:58:58.776275   42791 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1082 bytes)
	I1202 18:58:58.794520   42791 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/machines/server.pem --> /etc/docker/server.pem (1220 bytes)
	I1202 18:58:58.812473   42791 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1675 bytes)
	I1202 18:58:58.830080   42791 provision.go:87] duration metric: took 373.595013ms to configureAuth
	I1202 18:58:58.830097   42791 ubuntu.go:206] setting minikube options for container-runtime
	I1202 18:58:58.830287   42791 config.go:182] Loaded profile config "functional-449836": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-beta.0
	I1202 18:58:58.830292   42791 machine.go:97] duration metric: took 3.88495686s to provisionDockerMachine
	I1202 18:58:58.830298   42791 client.go:176] duration metric: took 6.098232424s to LocalClient.Create
	I1202 18:58:58.830311   42791 start.go:167] duration metric: took 6.09827249s to libmachine.API.Create "functional-449836"
	I1202 18:58:58.830316   42791 start.go:293] postStartSetup for "functional-449836" (driver="docker")
	I1202 18:58:58.830326   42791 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I1202 18:58:58.830373   42791 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I1202 18:58:58.830418   42791 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-449836
	I1202 18:58:58.849514   42791 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:32788 SSHKeyPath:/home/jenkins/minikube-integration/22021-2487/.minikube/machines/functional-449836/id_rsa Username:docker}
	I1202 18:58:58.956256   42791 ssh_runner.go:195] Run: cat /etc/os-release
	I1202 18:58:58.959670   42791 main.go:143] libmachine: Couldn't set key VERSION_CODENAME, no corresponding struct field found
	I1202 18:58:58.959687   42791 info.go:137] Remote host: Debian GNU/Linux 12 (bookworm)
	I1202 18:58:58.959696   42791 filesync.go:126] Scanning /home/jenkins/minikube-integration/22021-2487/.minikube/addons for local assets ...
	I1202 18:58:58.959756   42791 filesync.go:126] Scanning /home/jenkins/minikube-integration/22021-2487/.minikube/files for local assets ...
	I1202 18:58:58.959842   42791 filesync.go:149] local asset: /home/jenkins/minikube-integration/22021-2487/.minikube/files/etc/ssl/certs/44352.pem -> 44352.pem in /etc/ssl/certs
	I1202 18:58:58.959915   42791 filesync.go:149] local asset: /home/jenkins/minikube-integration/22021-2487/.minikube/files/etc/test/nested/copy/4435/hosts -> hosts in /etc/test/nested/copy/4435
	I1202 18:58:58.959959   42791 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs /etc/test/nested/copy/4435
	I1202 18:58:58.967908   42791 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/files/etc/ssl/certs/44352.pem --> /etc/ssl/certs/44352.pem (1708 bytes)
	I1202 18:58:58.987264   42791 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/files/etc/test/nested/copy/4435/hosts --> /etc/test/nested/copy/4435/hosts (40 bytes)
	I1202 18:58:59.004178   42791 start.go:296] duration metric: took 173.849302ms for postStartSetup
	I1202 18:58:59.004555   42791 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" functional-449836
	I1202 18:58:59.021952   42791 profile.go:143] Saving config to /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/config.json ...
	I1202 18:58:59.022233   42791 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I1202 18:58:59.022275   42791 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-449836
	I1202 18:58:59.039993   42791 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:32788 SSHKeyPath:/home/jenkins/minikube-integration/22021-2487/.minikube/machines/functional-449836/id_rsa Username:docker}
	I1202 18:58:59.140891   42791 ssh_runner.go:195] Run: sh -c "df -BG /var | awk 'NR==2{print $4}'"
	I1202 18:58:59.145197   42791 start.go:128] duration metric: took 6.418423516s to createHost
	I1202 18:58:59.145212   42791 start.go:83] releasing machines lock for "functional-449836", held for 6.41851662s
	I1202 18:58:59.145276   42791 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" functional-449836
	I1202 18:58:59.166136   42791 out.go:179] * Found network options:
	I1202 18:58:59.169013   42791 out.go:179]   - HTTP_PROXY=localhost:41973
	W1202 18:58:59.171845   42791 out.go:285] ! You appear to be using a proxy, but your NO_PROXY environment does not include the minikube IP (192.168.49.2).
	I1202 18:58:59.174600   42791 out.go:179] * Please see https://minikube.sigs.k8s.io/docs/handbook/vpn_and_proxy/ for more details
	I1202 18:58:59.177463   42791 ssh_runner.go:195] Run: cat /version.json
	I1202 18:58:59.177509   42791 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-449836
	I1202 18:58:59.177526   42791 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I1202 18:58:59.177588   42791 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-449836
	I1202 18:58:59.195418   42791 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:32788 SSHKeyPath:/home/jenkins/minikube-integration/22021-2487/.minikube/machines/functional-449836/id_rsa Username:docker}
	I1202 18:58:59.205926   42791 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:32788 SSHKeyPath:/home/jenkins/minikube-integration/22021-2487/.minikube/machines/functional-449836/id_rsa Username:docker}
	I1202 18:58:59.381034   42791 ssh_runner.go:195] Run: systemctl --version
	I1202 18:58:59.387462   42791 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	W1202 18:58:59.391569   42791 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I1202 18:58:59.391631   42791 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%p, " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I1202 18:58:59.420373   42791 cni.go:262] disabled [/etc/cni/net.d/87-podman-bridge.conflist, /etc/cni/net.d/10-crio-bridge.conflist.disabled] bridge cni config(s)
	I1202 18:58:59.420386   42791 start.go:496] detecting cgroup driver to use...
	I1202 18:58:59.420417   42791 detect.go:187] detected "cgroupfs" cgroup driver on host os
	I1202 18:58:59.420479   42791 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I1202 18:58:59.435229   42791 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I1202 18:58:59.448969   42791 docker.go:218] disabling cri-docker service (if available) ...
	I1202 18:58:59.449021   42791 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.socket
	I1202 18:58:59.466577   42791 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.service
	I1202 18:58:59.484976   42791 ssh_runner.go:195] Run: sudo systemctl disable cri-docker.socket
	I1202 18:58:59.596402   42791 ssh_runner.go:195] Run: sudo systemctl mask cri-docker.service
	I1202 18:58:59.722993   42791 docker.go:234] disabling docker service ...
	I1202 18:58:59.723080   42791 ssh_runner.go:195] Run: sudo systemctl stop -f docker.socket
	I1202 18:58:59.744938   42791 ssh_runner.go:195] Run: sudo systemctl stop -f docker.service
	I1202 18:58:59.757997   42791 ssh_runner.go:195] Run: sudo systemctl disable docker.socket
	I1202 18:58:59.872052   42791 ssh_runner.go:195] Run: sudo systemctl mask docker.service
	I1202 18:59:00.000506   42791 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service docker
	I1202 18:59:00.061711   42791 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I1202 18:59:00.115661   42791 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.10.1"|' /etc/containerd/config.toml"
	I1202 18:59:00.136188   42791 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I1202 18:59:00.181662   42791 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I1202 18:59:00.181736   42791 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I1202 18:59:00.217867   42791 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I1202 18:59:00.230431   42791 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I1202 18:59:00.253271   42791 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I1202 18:59:00.267908   42791 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I1202 18:59:00.282430   42791 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I1202 18:59:00.307530   42791 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I1202 18:59:00.319002   42791 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
	I1202 18:59:00.331376   42791 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I1202 18:59:00.341925   42791 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I1202 18:59:00.351412   42791 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1202 18:59:00.485623   42791 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I1202 18:59:00.575139   42791 start.go:543] Will wait 60s for socket path /run/containerd/containerd.sock
	I1202 18:59:00.575198   42791 ssh_runner.go:195] Run: stat /run/containerd/containerd.sock
	I1202 18:59:00.579244   42791 start.go:564] Will wait 60s for crictl version
	I1202 18:59:00.579308   42791 ssh_runner.go:195] Run: which crictl
	I1202 18:59:00.583071   42791 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl version
	I1202 18:59:00.609320   42791 start.go:580] Version:  0.1.0
	RuntimeName:  containerd
	RuntimeVersion:  v2.1.5
	RuntimeApiVersion:  v1
	I1202 18:59:00.609377   42791 ssh_runner.go:195] Run: containerd --version
	I1202 18:59:00.629873   42791 ssh_runner.go:195] Run: containerd --version
	I1202 18:59:00.655344   42791 out.go:179] * Preparing Kubernetes v1.35.0-beta.0 on containerd 2.1.5 ...
	I1202 18:59:00.658158   42791 cli_runner.go:164] Run: docker network inspect functional-449836 --format "{"Name": "{{.Name}}","Driver": "{{.Driver}}","Subnet": "{{range .IPAM.Config}}{{.Subnet}}{{end}}","Gateway": "{{range .IPAM.Config}}{{.Gateway}}{{end}}","MTU": {{if (index .Options "com.docker.network.driver.mtu")}}{{(index .Options "com.docker.network.driver.mtu")}}{{else}}0{{end}}, "ContainerIPs": [{{range $k,$v := .Containers }}"{{$v.IPv4Address}}",{{end}}]}"
	I1202 18:59:00.673966   42791 ssh_runner.go:195] Run: grep 192.168.49.1	host.minikube.internal$ /etc/hosts
	I1202 18:59:00.678164   42791 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\thost.minikube.internal$' "/etc/hosts"; echo "192.168.49.1	host.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I1202 18:59:00.688208   42791 kubeadm.go:884] updating cluster {Name:functional-449836 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-449836 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} ...
	I1202 18:59:00.688307   42791 preload.go:188] Checking if preload exists for k8s version v1.35.0-beta.0 and runtime containerd
	I1202 18:59:00.688387   42791 ssh_runner.go:195] Run: sudo crictl images --output json
	I1202 18:59:00.712591   42791 containerd.go:623] couldn't find preloaded image for "registry.k8s.io/kube-apiserver:v1.35.0-beta.0". assuming images are not preloaded.
	I1202 18:59:00.712606   42791 cache_images.go:90] LoadCachedImages start: [registry.k8s.io/kube-apiserver:v1.35.0-beta.0 registry.k8s.io/kube-controller-manager:v1.35.0-beta.0 registry.k8s.io/kube-scheduler:v1.35.0-beta.0 registry.k8s.io/kube-proxy:v1.35.0-beta.0 registry.k8s.io/pause:3.10.1 registry.k8s.io/etcd:3.6.5-0 registry.k8s.io/coredns/coredns:v1.13.1 gcr.io/k8s-minikube/storage-provisioner:v5]
	I1202 18:59:00.712653   42791 image.go:138] retrieving image: gcr.io/k8s-minikube/storage-provisioner:v5
	I1202 18:59:00.712868   42791 image.go:138] retrieving image: registry.k8s.io/kube-apiserver:v1.35.0-beta.0
	I1202 18:59:00.712951   42791 image.go:138] retrieving image: registry.k8s.io/kube-controller-manager:v1.35.0-beta.0
	I1202 18:59:00.713027   42791 image.go:138] retrieving image: registry.k8s.io/kube-scheduler:v1.35.0-beta.0
	I1202 18:59:00.713103   42791 image.go:138] retrieving image: registry.k8s.io/kube-proxy:v1.35.0-beta.0
	I1202 18:59:00.713175   42791 image.go:138] retrieving image: registry.k8s.io/pause:3.10.1
	I1202 18:59:00.713247   42791 image.go:138] retrieving image: registry.k8s.io/etcd:3.6.5-0
	I1202 18:59:00.713316   42791 image.go:138] retrieving image: registry.k8s.io/coredns/coredns:v1.13.1
	I1202 18:59:00.715961   42791 image.go:181] daemon lookup for registry.k8s.io/kube-scheduler:v1.35.0-beta.0: Error response from daemon: No such image: registry.k8s.io/kube-scheduler:v1.35.0-beta.0
	I1202 18:59:00.716393   42791 image.go:181] daemon lookup for registry.k8s.io/kube-proxy:v1.35.0-beta.0: Error response from daemon: No such image: registry.k8s.io/kube-proxy:v1.35.0-beta.0
	I1202 18:59:00.716650   42791 image.go:181] daemon lookup for registry.k8s.io/kube-apiserver:v1.35.0-beta.0: Error response from daemon: No such image: registry.k8s.io/kube-apiserver:v1.35.0-beta.0
	I1202 18:59:00.716770   42791 image.go:181] daemon lookup for gcr.io/k8s-minikube/storage-provisioner:v5: Error response from daemon: No such image: gcr.io/k8s-minikube/storage-provisioner:v5
	I1202 18:59:00.716855   42791 image.go:181] daemon lookup for registry.k8s.io/pause:3.10.1: Error response from daemon: No such image: registry.k8s.io/pause:3.10.1
	I1202 18:59:00.717010   42791 image.go:181] daemon lookup for registry.k8s.io/etcd:3.6.5-0: Error response from daemon: No such image: registry.k8s.io/etcd:3.6.5-0
	I1202 18:59:00.717015   42791 image.go:181] daemon lookup for registry.k8s.io/coredns/coredns:v1.13.1: Error response from daemon: No such image: registry.k8s.io/coredns/coredns:v1.13.1
	I1202 18:59:00.717164   42791 image.go:181] daemon lookup for registry.k8s.io/kube-controller-manager:v1.35.0-beta.0: Error response from daemon: No such image: registry.k8s.io/kube-controller-manager:v1.35.0-beta.0
	I1202 18:59:01.056450   42791 containerd.go:267] Checking existence of image with name "registry.k8s.io/kube-controller-manager:v1.35.0-beta.0" and sha "68b5f775f18769fcb77bd8474c80bda2050163b6c66f4551f352b7381b8ca5be"
	I1202 18:59:01.056510   42791 ssh_runner.go:195] Run: sudo ctr -n=k8s.io images ls name==registry.k8s.io/kube-controller-manager:v1.35.0-beta.0
	I1202 18:59:01.076156   42791 containerd.go:267] Checking existence of image with name "registry.k8s.io/etcd:3.6.5-0" and sha "2c5f0dedd21c25ec3a6709934d22152d53ec50fe57b72d29e4450655e3d14d42"
	I1202 18:59:01.076224   42791 ssh_runner.go:195] Run: sudo ctr -n=k8s.io images ls name==registry.k8s.io/etcd:3.6.5-0
	I1202 18:59:01.076290   42791 containerd.go:267] Checking existence of image with name "registry.k8s.io/kube-proxy:v1.35.0-beta.0" and sha "404c2e12861777b763b8feaa316d36680fc68ad308a8d2f6e55f1bb981cdd904"
	I1202 18:59:01.076352   42791 ssh_runner.go:195] Run: sudo ctr -n=k8s.io images ls name==registry.k8s.io/kube-proxy:v1.35.0-beta.0
	I1202 18:59:01.079907   42791 cache_images.go:118] "registry.k8s.io/kube-controller-manager:v1.35.0-beta.0" needs transfer: "registry.k8s.io/kube-controller-manager:v1.35.0-beta.0" does not exist at hash "68b5f775f18769fcb77bd8474c80bda2050163b6c66f4551f352b7381b8ca5be" in container runtime
	I1202 18:59:01.079940   42791 cri.go:218] Removing image: registry.k8s.io/kube-controller-manager:v1.35.0-beta.0
	I1202 18:59:01.079987   42791 ssh_runner.go:195] Run: which crictl
	I1202 18:59:01.085118   42791 containerd.go:267] Checking existence of image with name "registry.k8s.io/kube-apiserver:v1.35.0-beta.0" and sha "ccd634d9bcc36ac6235e9c86676cd3a02c06afc3788a25f1bbf39ca7d44585f4"
	I1202 18:59:01.085176   42791 ssh_runner.go:195] Run: sudo ctr -n=k8s.io images ls name==registry.k8s.io/kube-apiserver:v1.35.0-beta.0
	I1202 18:59:01.097878   42791 containerd.go:267] Checking existence of image with name "registry.k8s.io/coredns/coredns:v1.13.1" and sha "e08f4d9d2e6ede8185064c13b41f8eeee95b609c0ca93b6fe7509fe527c907cf"
	I1202 18:59:01.097941   42791 ssh_runner.go:195] Run: sudo ctr -n=k8s.io images ls name==registry.k8s.io/coredns/coredns:v1.13.1
	I1202 18:59:01.113700   42791 cache_images.go:118] "registry.k8s.io/kube-proxy:v1.35.0-beta.0" needs transfer: "registry.k8s.io/kube-proxy:v1.35.0-beta.0" does not exist at hash "404c2e12861777b763b8feaa316d36680fc68ad308a8d2f6e55f1bb981cdd904" in container runtime
	I1202 18:59:01.113731   42791 cri.go:218] Removing image: registry.k8s.io/kube-proxy:v1.35.0-beta.0
	I1202 18:59:01.113738   42791 cache_images.go:118] "registry.k8s.io/etcd:3.6.5-0" needs transfer: "registry.k8s.io/etcd:3.6.5-0" does not exist at hash "2c5f0dedd21c25ec3a6709934d22152d53ec50fe57b72d29e4450655e3d14d42" in container runtime
	I1202 18:59:01.113757   42791 cri.go:218] Removing image: registry.k8s.io/etcd:3.6.5-0
	I1202 18:59:01.113781   42791 ssh_runner.go:195] Run: which crictl
	I1202 18:59:01.113793   42791 ssh_runner.go:195] Run: which crictl
	I1202 18:59:01.113862   42791 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/kube-controller-manager:v1.35.0-beta.0
	I1202 18:59:01.138908   42791 cache_images.go:118] "registry.k8s.io/kube-apiserver:v1.35.0-beta.0" needs transfer: "registry.k8s.io/kube-apiserver:v1.35.0-beta.0" does not exist at hash "ccd634d9bcc36ac6235e9c86676cd3a02c06afc3788a25f1bbf39ca7d44585f4" in container runtime
	I1202 18:59:01.138939   42791 cri.go:218] Removing image: registry.k8s.io/kube-apiserver:v1.35.0-beta.0
	I1202 18:59:01.139013   42791 ssh_runner.go:195] Run: which crictl
	I1202 18:59:01.139100   42791 cache_images.go:118] "registry.k8s.io/coredns/coredns:v1.13.1" needs transfer: "registry.k8s.io/coredns/coredns:v1.13.1" does not exist at hash "e08f4d9d2e6ede8185064c13b41f8eeee95b609c0ca93b6fe7509fe527c907cf" in container runtime
	I1202 18:59:01.139114   42791 cri.go:218] Removing image: registry.k8s.io/coredns/coredns:v1.13.1
	I1202 18:59:01.139147   42791 ssh_runner.go:195] Run: which crictl
	I1202 18:59:01.139223   42791 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/kube-proxy:v1.35.0-beta.0
	I1202 18:59:01.139336   42791 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/etcd:3.6.5-0
	I1202 18:59:01.141027   42791 containerd.go:267] Checking existence of image with name "registry.k8s.io/pause:3.10.1" and sha "d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd"
	I1202 18:59:01.141124   42791 ssh_runner.go:195] Run: sudo ctr -n=k8s.io images ls name==registry.k8s.io/pause:3.10.1
	I1202 18:59:01.157527   42791 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/kube-controller-manager:v1.35.0-beta.0
	I1202 18:59:01.213944   42791 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/etcd:3.6.5-0
	I1202 18:59:01.213999   42791 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/kube-proxy:v1.35.0-beta.0
	I1202 18:59:01.214017   42791 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/kube-apiserver:v1.35.0-beta.0
	I1202 18:59:01.214071   42791 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/coredns/coredns:v1.13.1
	I1202 18:59:01.214100   42791 cache_images.go:118] "registry.k8s.io/pause:3.10.1" needs transfer: "registry.k8s.io/pause:3.10.1" does not exist at hash "d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd" in container runtime
	I1202 18:59:01.214122   42791 cri.go:218] Removing image: registry.k8s.io/pause:3.10.1
	I1202 18:59:01.214151   42791 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/kube-controller-manager:v1.35.0-beta.0
	I1202 18:59:01.214162   42791 ssh_runner.go:195] Run: which crictl
	I1202 18:59:01.232524   42791 containerd.go:267] Checking existence of image with name "registry.k8s.io/kube-scheduler:v1.35.0-beta.0" and sha "16378741539f1be9c6e347d127537d379a6592587b09b4eb47964cb5c43a409b"
	I1202 18:59:01.232592   42791 ssh_runner.go:195] Run: sudo ctr -n=k8s.io images ls name==registry.k8s.io/kube-scheduler:v1.35.0-beta.0
	I1202 18:59:01.293606   42791 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/pause:3.10.1
	I1202 18:59:01.293681   42791 cache_images.go:291] Loading image from: /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/kube-controller-manager_v1.35.0-beta.0
	I1202 18:59:01.293766   42791 ssh_runner.go:195] Run: stat -c "%s %y" /var/lib/minikube/images/kube-controller-manager_v1.35.0-beta.0
	I1202 18:59:01.293817   42791 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/coredns/coredns:v1.13.1
	I1202 18:59:01.293827   42791 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/kube-apiserver:v1.35.0-beta.0
	I1202 18:59:01.298772   42791 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/kube-proxy:v1.35.0-beta.0
	I1202 18:59:01.298855   42791 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/etcd:3.6.5-0
	I1202 18:59:01.299517   42791 cache_images.go:118] "registry.k8s.io/kube-scheduler:v1.35.0-beta.0" needs transfer: "registry.k8s.io/kube-scheduler:v1.35.0-beta.0" does not exist at hash "16378741539f1be9c6e347d127537d379a6592587b09b4eb47964cb5c43a409b" in container runtime
	I1202 18:59:01.299544   42791 cri.go:218] Removing image: registry.k8s.io/kube-scheduler:v1.35.0-beta.0
	I1202 18:59:01.299603   42791 ssh_runner.go:195] Run: which crictl
	I1202 18:59:01.384460   42791 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/kube-apiserver:v1.35.0-beta.0
	I1202 18:59:01.384485   42791 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/pause:3.10.1
	I1202 18:59:01.384537   42791 ssh_runner.go:352] existence check for /var/lib/minikube/images/kube-controller-manager_v1.35.0-beta.0: stat -c "%s %y" /var/lib/minikube/images/kube-controller-manager_v1.35.0-beta.0: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/images/kube-controller-manager_v1.35.0-beta.0': No such file or directory
	I1202 18:59:01.384551   42791 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/kube-controller-manager_v1.35.0-beta.0 --> /var/lib/minikube/images/kube-controller-manager_v1.35.0-beta.0 (20672000 bytes)
	I1202 18:59:01.384580   42791 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/coredns/coredns:v1.13.1
	I1202 18:59:01.384632   42791 cache_images.go:291] Loading image from: /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/kube-proxy_v1.35.0-beta.0
	I1202 18:59:01.384653   42791 cache_images.go:291] Loading image from: /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/etcd_3.6.5-0
	I1202 18:59:01.384698   42791 ssh_runner.go:195] Run: stat -c "%s %y" /var/lib/minikube/images/kube-proxy_v1.35.0-beta.0
	I1202 18:59:01.384720   42791 ssh_runner.go:195] Run: stat -c "%s %y" /var/lib/minikube/images/etcd_3.6.5-0
	I1202 18:59:01.384763   42791 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/kube-scheduler:v1.35.0-beta.0
	I1202 18:59:01.494028   42791 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/kube-scheduler:v1.35.0-beta.0
	I1202 18:59:01.494097   42791 cache_images.go:291] Loading image from: /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/kube-apiserver_v1.35.0-beta.0
	I1202 18:59:01.494164   42791 ssh_runner.go:195] Run: stat -c "%s %y" /var/lib/minikube/images/kube-apiserver_v1.35.0-beta.0
	I1202 18:59:01.494219   42791 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/pause:3.10.1
	I1202 18:59:01.494257   42791 cache_images.go:291] Loading image from: /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/coredns/coredns_v1.13.1
	I1202 18:59:01.494295   42791 ssh_runner.go:195] Run: stat -c "%s %y" /var/lib/minikube/images/coredns_v1.13.1
	I1202 18:59:01.494345   42791 ssh_runner.go:352] existence check for /var/lib/minikube/images/kube-proxy_v1.35.0-beta.0: stat -c "%s %y" /var/lib/minikube/images/kube-proxy_v1.35.0-beta.0: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/images/kube-proxy_v1.35.0-beta.0': No such file or directory
	I1202 18:59:01.494360   42791 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/kube-proxy_v1.35.0-beta.0 --> /var/lib/minikube/images/kube-proxy_v1.35.0-beta.0 (22432256 bytes)
	I1202 18:59:01.494397   42791 ssh_runner.go:352] existence check for /var/lib/minikube/images/etcd_3.6.5-0: stat -c "%s %y" /var/lib/minikube/images/etcd_3.6.5-0: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/images/etcd_3.6.5-0': No such file or directory
	I1202 18:59:01.494404   42791 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/etcd_3.6.5-0 --> /var/lib/minikube/images/etcd_3.6.5-0 (21148160 bytes)
	I1202 18:59:01.574725   42791 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/kube-scheduler:v1.35.0-beta.0
	I1202 18:59:01.574780   42791 ssh_runner.go:352] existence check for /var/lib/minikube/images/kube-apiserver_v1.35.0-beta.0: stat -c "%s %y" /var/lib/minikube/images/kube-apiserver_v1.35.0-beta.0: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/images/kube-apiserver_v1.35.0-beta.0': No such file or directory
	I1202 18:59:01.574793   42791 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/kube-apiserver_v1.35.0-beta.0 --> /var/lib/minikube/images/kube-apiserver_v1.35.0-beta.0 (24689152 bytes)
	I1202 18:59:01.574844   42791 cache_images.go:291] Loading image from: /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/pause_3.10.1
	I1202 18:59:01.574897   42791 ssh_runner.go:352] existence check for /var/lib/minikube/images/coredns_v1.13.1: stat -c "%s %y" /var/lib/minikube/images/coredns_v1.13.1: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/images/coredns_v1.13.1': No such file or directory
	I1202 18:59:01.574910   42791 ssh_runner.go:195] Run: stat -c "%s %y" /var/lib/minikube/images/pause_3.10.1
	I1202 18:59:01.574916   42791 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/coredns/coredns_v1.13.1 --> /var/lib/minikube/images/coredns_v1.13.1 (21178368 bytes)
	I1202 18:59:01.679697   42791 ssh_runner.go:352] existence check for /var/lib/minikube/images/pause_3.10.1: stat -c "%s %y" /var/lib/minikube/images/pause_3.10.1: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/images/pause_3.10.1': No such file or directory
	I1202 18:59:01.679725   42791 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/pause_3.10.1 --> /var/lib/minikube/images/pause_3.10.1 (268288 bytes)
	I1202 18:59:01.679802   42791 cache_images.go:291] Loading image from: /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/kube-scheduler_v1.35.0-beta.0
	I1202 18:59:01.679883   42791 ssh_runner.go:195] Run: stat -c "%s %y" /var/lib/minikube/images/kube-scheduler_v1.35.0-beta.0
	I1202 18:59:01.754104   42791 ssh_runner.go:352] existence check for /var/lib/minikube/images/kube-scheduler_v1.35.0-beta.0: stat -c "%s %y" /var/lib/minikube/images/kube-scheduler_v1.35.0-beta.0: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/images/kube-scheduler_v1.35.0-beta.0': No such file or directory
	I1202 18:59:01.754156   42791 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/kube-scheduler_v1.35.0-beta.0 --> /var/lib/minikube/images/kube-scheduler_v1.35.0-beta.0 (15401984 bytes)
	I1202 18:59:01.810610   42791 containerd.go:285] Loading image: /var/lib/minikube/images/pause_3.10.1
	I1202 18:59:01.810672   42791 ssh_runner.go:195] Run: sudo ctr -n=k8s.io images import /var/lib/minikube/images/pause_3.10.1
	W1202 18:59:02.103750   42791 image.go:286] image gcr.io/k8s-minikube/storage-provisioner:v5 arch mismatch: want arm64 got amd64. fixing
	I1202 18:59:02.103872   42791 containerd.go:267] Checking existence of image with name "gcr.io/k8s-minikube/storage-provisioner:v5" and sha "66749159455b3f08c8318fe0233122f54d0f5889f9c5fdfb73c3fd9d99895b51"
	I1202 18:59:02.103935   42791 ssh_runner.go:195] Run: sudo ctr -n=k8s.io images ls name==gcr.io/k8s-minikube/storage-provisioner:v5
	I1202 18:59:02.162822   42791 cache_images.go:323] Transferred and loaded /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/pause_3.10.1 from cache
	I1202 18:59:02.162847   42791 containerd.go:285] Loading image: /var/lib/minikube/images/kube-controller-manager_v1.35.0-beta.0
	I1202 18:59:02.162909   42791 ssh_runner.go:195] Run: sudo ctr -n=k8s.io images import /var/lib/minikube/images/kube-controller-manager_v1.35.0-beta.0
	I1202 18:59:02.181523   42791 cache_images.go:118] "gcr.io/k8s-minikube/storage-provisioner:v5" needs transfer: "gcr.io/k8s-minikube/storage-provisioner:v5" does not exist at hash "66749159455b3f08c8318fe0233122f54d0f5889f9c5fdfb73c3fd9d99895b51" in container runtime
	I1202 18:59:02.181553   42791 cri.go:218] Removing image: gcr.io/k8s-minikube/storage-provisioner:v5
	I1202 18:59:02.181619   42791 ssh_runner.go:195] Run: which crictl
	I1202 18:59:03.531560   42791 ssh_runner.go:235] Completed: which crictl: (1.349924164s)
	I1202 18:59:03.531616   42791 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi gcr.io/k8s-minikube/storage-provisioner:v5
	I1202 18:59:03.531664   42791 ssh_runner.go:235] Completed: sudo ctr -n=k8s.io images import /var/lib/minikube/images/kube-controller-manager_v1.35.0-beta.0: (1.368744999s)
	I1202 18:59:03.531673   42791 cache_images.go:323] Transferred and loaded /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/kube-controller-manager_v1.35.0-beta.0 from cache
	I1202 18:59:03.531688   42791 containerd.go:285] Loading image: /var/lib/minikube/images/etcd_3.6.5-0
	I1202 18:59:03.531723   42791 ssh_runner.go:195] Run: sudo ctr -n=k8s.io images import /var/lib/minikube/images/etcd_3.6.5-0
	I1202 18:59:04.927028   42791 ssh_runner.go:235] Completed: sudo ctr -n=k8s.io images import /var/lib/minikube/images/etcd_3.6.5-0: (1.395284101s)
	I1202 18:59:04.927045   42791 cache_images.go:323] Transferred and loaded /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/etcd_3.6.5-0 from cache
	I1202 18:59:04.927099   42791 ssh_runner.go:235] Completed: sudo /usr/local/bin/crictl rmi gcr.io/k8s-minikube/storage-provisioner:v5: (1.395473229s)
	I1202 18:59:04.927165   42791 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi gcr.io/k8s-minikube/storage-provisioner:v5
	I1202 18:59:04.927224   42791 containerd.go:285] Loading image: /var/lib/minikube/images/kube-proxy_v1.35.0-beta.0
	I1202 18:59:04.927249   42791 ssh_runner.go:195] Run: sudo ctr -n=k8s.io images import /var/lib/minikube/images/kube-proxy_v1.35.0-beta.0
	I1202 18:59:04.954933   42791 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi gcr.io/k8s-minikube/storage-provisioner:v5
	I1202 18:59:05.961324   42791 ssh_runner.go:235] Completed: sudo ctr -n=k8s.io images import /var/lib/minikube/images/kube-proxy_v1.35.0-beta.0: (1.034055749s)
	I1202 18:59:05.961340   42791 cache_images.go:323] Transferred and loaded /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/kube-proxy_v1.35.0-beta.0 from cache
	I1202 18:59:05.961355   42791 containerd.go:285] Loading image: /var/lib/minikube/images/coredns_v1.13.1
	I1202 18:59:05.961402   42791 ssh_runner.go:195] Run: sudo ctr -n=k8s.io images import /var/lib/minikube/images/coredns_v1.13.1
	I1202 18:59:05.961472   42791 ssh_runner.go:235] Completed: sudo /usr/local/bin/crictl rmi gcr.io/k8s-minikube/storage-provisioner:v5: (1.006527232s)
	I1202 18:59:05.961493   42791 cache_images.go:291] Loading image from: /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/gcr.io/k8s-minikube/storage-provisioner_v5
	I1202 18:59:05.961555   42791 ssh_runner.go:195] Run: stat -c "%s %y" /var/lib/minikube/images/storage-provisioner_v5
	I1202 18:59:06.930859   42791 cache_images.go:323] Transferred and loaded /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/coredns/coredns_v1.13.1 from cache
	I1202 18:59:06.930894   42791 containerd.go:285] Loading image: /var/lib/minikube/images/kube-scheduler_v1.35.0-beta.0
	I1202 18:59:06.930946   42791 ssh_runner.go:195] Run: sudo ctr -n=k8s.io images import /var/lib/minikube/images/kube-scheduler_v1.35.0-beta.0
	I1202 18:59:06.930963   42791 ssh_runner.go:352] existence check for /var/lib/minikube/images/storage-provisioner_v5: stat -c "%s %y" /var/lib/minikube/images/storage-provisioner_v5: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/images/storage-provisioner_v5': No such file or directory
	I1202 18:59:06.930995   42791 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/gcr.io/k8s-minikube/storage-provisioner_v5 --> /var/lib/minikube/images/storage-provisioner_v5 (8035840 bytes)
	I1202 18:59:07.724713   42791 cache_images.go:323] Transferred and loaded /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/kube-scheduler_v1.35.0-beta.0 from cache
	I1202 18:59:07.724736   42791 containerd.go:285] Loading image: /var/lib/minikube/images/kube-apiserver_v1.35.0-beta.0
	I1202 18:59:07.724790   42791 ssh_runner.go:195] Run: sudo ctr -n=k8s.io images import /var/lib/minikube/images/kube-apiserver_v1.35.0-beta.0
	I1202 18:59:08.786365   42791 ssh_runner.go:235] Completed: sudo ctr -n=k8s.io images import /var/lib/minikube/images/kube-apiserver_v1.35.0-beta.0: (1.061552643s)
	I1202 18:59:08.786390   42791 cache_images.go:323] Transferred and loaded /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/kube-apiserver_v1.35.0-beta.0 from cache
	I1202 18:59:08.786416   42791 containerd.go:285] Loading image: /var/lib/minikube/images/storage-provisioner_v5
	I1202 18:59:08.786462   42791 ssh_runner.go:195] Run: sudo ctr -n=k8s.io images import /var/lib/minikube/images/storage-provisioner_v5
	I1202 18:59:09.144197   42791 cache_images.go:323] Transferred and loaded /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/gcr.io/k8s-minikube/storage-provisioner_v5 from cache
	I1202 18:59:09.144221   42791 cache_images.go:125] Successfully loaded all cached images
	I1202 18:59:09.144225   42791 cache_images.go:94] duration metric: took 8.431607178s to LoadCachedImages
	I1202 18:59:09.144237   42791 kubeadm.go:935] updating node { 192.168.49.2 8441 v1.35.0-beta.0 containerd true true} ...
	I1202 18:59:09.144381   42791 kubeadm.go:947] kubelet [Unit]
	Wants=containerd.service
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.35.0-beta.0/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=functional-449836 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.168.49.2
	
	[Install]
	 config:
	{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-449836 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I1202 18:59:09.144465   42791 ssh_runner.go:195] Run: sudo crictl info
	I1202 18:59:09.170770   42791 cni.go:84] Creating CNI manager for ""
	I1202 18:59:09.170778   42791 cni.go:143] "docker" driver + "containerd" runtime found, recommending kindnet
	I1202 18:59:09.170795   42791 kubeadm.go:85] Using pod CIDR: 10.244.0.0/16
	I1202 18:59:09.170816   42791 kubeadm.go:190] kubeadm options: {CertDir:/var/lib/minikube/certs ServiceCIDR:10.96.0.0/12 PodSubnet:10.244.0.0/16 AdvertiseAddress:192.168.49.2 APIServerPort:8441 KubernetesVersion:v1.35.0-beta.0 EtcdDataDir:/var/lib/minikube/etcd EtcdExtraArgs:map[] ClusterName:functional-449836 NodeName:functional-449836 DNSDomain:cluster.local CRISocket:/run/containerd/containerd.sock ImageRepository: ComponentOptions:[{Component:apiServer ExtraArgs:map[enable-admission-plugins:NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota] Pairs:map[certSANs:["127.0.0.1", "localhost", "192.168.49.2"]]} {Component:controllerManager ExtraArgs:map[allocate-node-cidrs:true leader-elect:false] Pairs:map[]} {Component:scheduler ExtraArgs:map[leader-elect:false] Pairs:map[]}] FeatureArgs:map[] NodeIP:192.168.49.2 CgroupDriver:cgroupfs ClientCAFile:/var/lib/minikube/certs/ca.crt StaticPodPath:/etc/kubernetes/manifests ControlPlaneAddress:control-plane.minikube.internal KubeProxyOptions:map[] ResolvConfSearchRegression:false KubeletConfigOpts:map[containerRuntimeEndpoint:unix:///run/containerd/containerd.sock hairpinMode:hairpin-veth runtimeRequestTimeout:15m] PrependCriSocketUnix:true}
	I1202 18:59:09.170919   42791 kubeadm.go:196] kubeadm config:
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: InitConfiguration
	localAPIEndpoint:
	  advertiseAddress: 192.168.49.2
	  bindPort: 8441
	bootstrapTokens:
	  - groups:
	      - system:bootstrappers:kubeadm:default-node-token
	    ttl: 24h0m0s
	    usages:
	      - signing
	      - authentication
	nodeRegistration:
	  criSocket: unix:///run/containerd/containerd.sock
	  name: "functional-449836"
	  kubeletExtraArgs:
	    - name: "node-ip"
	      value: "192.168.49.2"
	  taints: []
	---
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: ClusterConfiguration
	apiServer:
	  certSANs: ["127.0.0.1", "localhost", "192.168.49.2"]
	  extraArgs:
	    - name: "enable-admission-plugins"
	      value: "NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota"
	controllerManager:
	  extraArgs:
	    - name: "allocate-node-cidrs"
	      value: "true"
	    - name: "leader-elect"
	      value: "false"
	scheduler:
	  extraArgs:
	    - name: "leader-elect"
	      value: "false"
	certificatesDir: /var/lib/minikube/certs
	clusterName: mk
	controlPlaneEndpoint: control-plane.minikube.internal:8441
	etcd:
	  local:
	    dataDir: /var/lib/minikube/etcd
	kubernetesVersion: v1.35.0-beta.0
	networking:
	  dnsDomain: cluster.local
	  podSubnet: "10.244.0.0/16"
	  serviceSubnet: 10.96.0.0/12
	---
	apiVersion: kubelet.config.k8s.io/v1beta1
	kind: KubeletConfiguration
	authentication:
	  x509:
	    clientCAFile: /var/lib/minikube/certs/ca.crt
	cgroupDriver: cgroupfs
	containerRuntimeEndpoint: unix:///run/containerd/containerd.sock
	hairpinMode: hairpin-veth
	runtimeRequestTimeout: 15m
	clusterDomain: "cluster.local"
	# disable disk resource management by default
	imageGCHighThresholdPercent: 100
	evictionHard:
	  nodefs.available: "0%"
	  nodefs.inodesFree: "0%"
	  imagefs.available: "0%"
	failSwapOn: false
	staticPodPath: /etc/kubernetes/manifests
	---
	apiVersion: kubeproxy.config.k8s.io/v1alpha1
	kind: KubeProxyConfiguration
	clusterCIDR: "10.244.0.0/16"
	metricsBindAddress: 0.0.0.0:10249
	conntrack:
	  maxPerCore: 0
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_established"
	  tcpEstablishedTimeout: 0s
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_close"
	  tcpCloseWaitTimeout: 0s
	
	I1202 18:59:09.170985   42791 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.35.0-beta.0
	I1202 18:59:09.178776   42791 binaries.go:54] Didn't find k8s binaries: sudo ls /var/lib/minikube/binaries/v1.35.0-beta.0: Process exited with status 2
	stdout:
	
	stderr:
	ls: cannot access '/var/lib/minikube/binaries/v1.35.0-beta.0': No such file or directory
	
	Initiating transfer...
	I1202 18:59:09.178831   42791 ssh_runner.go:195] Run: sudo mkdir -p /var/lib/minikube/binaries/v1.35.0-beta.0
	I1202 18:59:09.186769   42791 binary.go:80] Not caching binary, using https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubectl?checksum=file:https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubectl.sha256
	I1202 18:59:09.186851   42791 ssh_runner.go:195] Run: stat -c "%s %y" /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl
	I1202 18:59:09.186938   42791 binary.go:80] Not caching binary, using https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubelet?checksum=file:https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubelet.sha256
	I1202 18:59:09.186967   42791 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I1202 18:59:09.187047   42791 binary.go:80] Not caching binary, using https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm?checksum=file:https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm.sha256
	I1202 18:59:09.187095   42791 ssh_runner.go:195] Run: stat -c "%s %y" /var/lib/minikube/binaries/v1.35.0-beta.0/kubeadm
	I1202 18:59:09.191795   42791 ssh_runner.go:352] existence check for /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl: stat -c "%s %y" /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/binaries/v1.35.0-beta.0/kubectl': No such file or directory
	I1202 18:59:09.191822   42791 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/cache/linux/arm64/v1.35.0-beta.0/kubectl --> /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl (55181496 bytes)
	I1202 18:59:09.206751   42791 ssh_runner.go:195] Run: stat -c "%s %y" /var/lib/minikube/binaries/v1.35.0-beta.0/kubelet
	I1202 18:59:09.207681   42791 ssh_runner.go:352] existence check for /var/lib/minikube/binaries/v1.35.0-beta.0/kubeadm: stat -c "%s %y" /var/lib/minikube/binaries/v1.35.0-beta.0/kubeadm: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/binaries/v1.35.0-beta.0/kubeadm': No such file or directory
	I1202 18:59:09.207704   42791 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/cache/linux/arm64/v1.35.0-beta.0/kubeadm --> /var/lib/minikube/binaries/v1.35.0-beta.0/kubeadm (68354232 bytes)
	I1202 18:59:09.234713   42791 ssh_runner.go:352] existence check for /var/lib/minikube/binaries/v1.35.0-beta.0/kubelet: stat -c "%s %y" /var/lib/minikube/binaries/v1.35.0-beta.0/kubelet: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/binaries/v1.35.0-beta.0/kubelet': No such file or directory
	I1202 18:59:09.234743   42791 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/cache/linux/arm64/v1.35.0-beta.0/kubelet --> /var/lib/minikube/binaries/v1.35.0-beta.0/kubelet (54329636 bytes)
	I1202 18:59:09.961960   42791 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /var/tmp/minikube
	I1202 18:59:09.970944   42791 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (328 bytes)
	I1202 18:59:09.989968   42791 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (359 bytes)
	I1202 18:59:10.007101   42791 ssh_runner.go:362] scp memory --> /var/tmp/minikube/kubeadm.yaml.new (2237 bytes)
	I1202 18:59:10.023260   42791 ssh_runner.go:195] Run: grep 192.168.49.2	control-plane.minikube.internal$ /etc/hosts
	I1202 18:59:10.027311   42791 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\tcontrol-plane.minikube.internal$' "/etc/hosts"; echo "192.168.49.2	control-plane.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I1202 18:59:10.038571   42791 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1202 18:59:10.151309   42791 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I1202 18:59:10.173782   42791 certs.go:69] Setting up /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836 for IP: 192.168.49.2
	I1202 18:59:10.173793   42791 certs.go:195] generating shared ca certs ...
	I1202 18:59:10.173808   42791 certs.go:227] acquiring lock for ca certs: {Name:mk2ce7651a779b9fbf8eac798f9ac184328de0c6 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1202 18:59:10.173989   42791 certs.go:236] skipping valid "minikubeCA" ca cert: /home/jenkins/minikube-integration/22021-2487/.minikube/ca.key
	I1202 18:59:10.174028   42791 certs.go:236] skipping valid "proxyClientCA" ca cert: /home/jenkins/minikube-integration/22021-2487/.minikube/proxy-client-ca.key
	I1202 18:59:10.174033   42791 certs.go:257] generating profile certs ...
	I1202 18:59:10.174096   42791 certs.go:364] generating signed profile cert for "minikube-user": /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/client.key
	I1202 18:59:10.174105   42791 crypto.go:68] Generating cert /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/client.crt with IP's: []
	I1202 18:59:10.350958   42791 crypto.go:156] Writing cert to /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/client.crt ...
	I1202 18:59:10.350976   42791 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/client.crt: {Name:mka3501c24c4a81a5cba9077ce4679d8fb7b6150 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1202 18:59:10.351202   42791 crypto.go:164] Writing key to /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/client.key ...
	I1202 18:59:10.351209   42791 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/client.key: {Name:mk6e8ec4e49bef25d90791ec183ac3c189612a66 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1202 18:59:10.351307   42791 certs.go:364] generating signed profile cert for "minikube": /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/apiserver.key.a65b71da
	I1202 18:59:10.351320   42791 crypto.go:68] Generating cert /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/apiserver.crt.a65b71da with IP's: [10.96.0.1 127.0.0.1 10.0.0.1 192.168.49.2]
	I1202 18:59:10.748305   42791 crypto.go:156] Writing cert to /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/apiserver.crt.a65b71da ...
	I1202 18:59:10.748327   42791 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/apiserver.crt.a65b71da: {Name:mk837ab465d52cf51c941e1272f64ca5e5bdcb78 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1202 18:59:10.748521   42791 crypto.go:164] Writing key to /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/apiserver.key.a65b71da ...
	I1202 18:59:10.748528   42791 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/apiserver.key.a65b71da: {Name:mkaa0675b09445ccc00059b025a1d7bd37b168f1 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1202 18:59:10.748613   42791 certs.go:382] copying /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/apiserver.crt.a65b71da -> /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/apiserver.crt
	I1202 18:59:10.748687   42791 certs.go:386] copying /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/apiserver.key.a65b71da -> /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/apiserver.key
	I1202 18:59:10.748738   42791 certs.go:364] generating signed profile cert for "aggregator": /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/proxy-client.key
	I1202 18:59:10.748752   42791 crypto.go:68] Generating cert /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/proxy-client.crt with IP's: []
	I1202 18:59:11.063670   42791 crypto.go:156] Writing cert to /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/proxy-client.crt ...
	I1202 18:59:11.063686   42791 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/proxy-client.crt: {Name:mk41c928bd4635bac6e4c433b422747dd3bec428 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1202 18:59:11.063882   42791 crypto.go:164] Writing key to /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/proxy-client.key ...
	I1202 18:59:11.063908   42791 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/proxy-client.key: {Name:mk6c4cf48ed7d0213c83dacf9677f8c21ee7e130 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1202 18:59:11.064120   42791 certs.go:484] found cert: /home/jenkins/minikube-integration/22021-2487/.minikube/certs/4435.pem (1338 bytes)
	W1202 18:59:11.064163   42791 certs.go:480] ignoring /home/jenkins/minikube-integration/22021-2487/.minikube/certs/4435_empty.pem, impossibly tiny 0 bytes
	I1202 18:59:11.064171   42791 certs.go:484] found cert: /home/jenkins/minikube-integration/22021-2487/.minikube/certs/ca-key.pem (1679 bytes)
	I1202 18:59:11.064200   42791 certs.go:484] found cert: /home/jenkins/minikube-integration/22021-2487/.minikube/certs/ca.pem (1082 bytes)
	I1202 18:59:11.064224   42791 certs.go:484] found cert: /home/jenkins/minikube-integration/22021-2487/.minikube/certs/cert.pem (1123 bytes)
	I1202 18:59:11.064246   42791 certs.go:484] found cert: /home/jenkins/minikube-integration/22021-2487/.minikube/certs/key.pem (1675 bytes)
	I1202 18:59:11.064290   42791 certs.go:484] found cert: /home/jenkins/minikube-integration/22021-2487/.minikube/files/etc/ssl/certs/44352.pem (1708 bytes)
	I1202 18:59:11.064951   42791 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I1202 18:59:11.083980   42791 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1675 bytes)
	I1202 18:59:11.103611   42791 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I1202 18:59:11.123391   42791 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1675 bytes)
	I1202 18:59:11.142411   42791 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1424 bytes)
	I1202 18:59:11.160627   42791 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1675 bytes)
	I1202 18:59:11.178247   42791 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I1202 18:59:11.195905   42791 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1675 bytes)
	I1202 18:59:11.216207   42791 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I1202 18:59:11.235530   42791 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/certs/4435.pem --> /usr/share/ca-certificates/4435.pem (1338 bytes)
	I1202 18:59:11.253846   42791 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/files/etc/ssl/certs/44352.pem --> /usr/share/ca-certificates/44352.pem (1708 bytes)
	I1202 18:59:11.271165   42791 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (738 bytes)
	I1202 18:59:11.284566   42791 ssh_runner.go:195] Run: openssl version
	I1202 18:59:11.290675   42791 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I1202 18:59:11.298989   42791 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I1202 18:59:11.302714   42791 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Dec  2 18:48 /usr/share/ca-certificates/minikubeCA.pem
	I1202 18:59:11.302768   42791 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I1202 18:59:11.349229   42791 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
	I1202 18:59:11.357787   42791 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/4435.pem && ln -fs /usr/share/ca-certificates/4435.pem /etc/ssl/certs/4435.pem"
	I1202 18:59:11.366075   42791 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/4435.pem
	I1202 18:59:11.370248   42791 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Dec  2 18:58 /usr/share/ca-certificates/4435.pem
	I1202 18:59:11.370304   42791 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/4435.pem
	I1202 18:59:11.411393   42791 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/51391683.0 || ln -fs /etc/ssl/certs/4435.pem /etc/ssl/certs/51391683.0"
	I1202 18:59:11.419953   42791 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/44352.pem && ln -fs /usr/share/ca-certificates/44352.pem /etc/ssl/certs/44352.pem"
	I1202 18:59:11.428712   42791 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/44352.pem
	I1202 18:59:11.432428   42791 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Dec  2 18:58 /usr/share/ca-certificates/44352.pem
	I1202 18:59:11.432481   42791 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/44352.pem
	I1202 18:59:11.474041   42791 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/3ec20f2e.0 || ln -fs /etc/ssl/certs/44352.pem /etc/ssl/certs/3ec20f2e.0"
	I1202 18:59:11.482618   42791 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I1202 18:59:11.486320   42791 certs.go:400] 'apiserver-kubelet-client' cert doesn't exist, likely first start: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/certs/apiserver-kubelet-client.crt': No such file or directory
	I1202 18:59:11.486362   42791 kubeadm.go:401] StartCluster: {Name:functional-449836 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-449836 Namespace:default APIServerHAVIP: APIServerName:minikubeCA
APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false
CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1202 18:59:11.486427   42791 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:paused Name: Namespaces:[kube-system]}
	I1202 18:59:11.486518   42791 ssh_runner.go:195] Run: sudo -s eval "crictl ps -a --quiet --label io.kubernetes.pod.namespace=kube-system"
	I1202 18:59:11.513005   42791 cri.go:89] found id: ""
	I1202 18:59:11.513063   42791 ssh_runner.go:195] Run: sudo ls /var/lib/kubelet/kubeadm-flags.env /var/lib/kubelet/config.yaml /var/lib/minikube/etcd
	I1202 18:59:11.521036   42791 ssh_runner.go:195] Run: sudo cp /var/tmp/minikube/kubeadm.yaml.new /var/tmp/minikube/kubeadm.yaml
	I1202 18:59:11.529228   42791 kubeadm.go:215] ignoring SystemVerification for kubeadm because of docker driver
	I1202 18:59:11.529282   42791 ssh_runner.go:195] Run: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf
	I1202 18:59:11.537318   42791 kubeadm.go:156] config check failed, skipping stale config cleanup: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	ls: cannot access '/etc/kubernetes/admin.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/kubelet.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/controller-manager.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/scheduler.conf': No such file or directory
	I1202 18:59:11.537327   42791 kubeadm.go:158] found existing configuration files:
	
	I1202 18:59:11.537387   42791 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/admin.conf
	I1202 18:59:11.545218   42791 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/admin.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/admin.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/admin.conf: No such file or directory
	I1202 18:59:11.545285   42791 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/admin.conf
	I1202 18:59:11.553215   42791 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf
	I1202 18:59:11.561314   42791 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/kubelet.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/kubelet.conf: No such file or directory
	I1202 18:59:11.561368   42791 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/kubelet.conf
	I1202 18:59:11.570130   42791 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf
	I1202 18:59:11.578285   42791 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/controller-manager.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/controller-manager.conf: No such file or directory
	I1202 18:59:11.578373   42791 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/controller-manager.conf
	I1202 18:59:11.585978   42791 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf
	I1202 18:59:11.593813   42791 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/scheduler.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/scheduler.conf: No such file or directory
	I1202 18:59:11.593885   42791 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/scheduler.conf
	I1202 18:59:11.601780   42791 ssh_runner.go:286] Start: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables"
	I1202 18:59:11.643892   42791 kubeadm.go:319] [init] Using Kubernetes version: v1.35.0-beta.0
	I1202 18:59:11.644125   42791 kubeadm.go:319] [preflight] Running pre-flight checks
	I1202 18:59:11.739484   42791 kubeadm.go:319] [preflight] The system verification failed. Printing the output from the verification:
	I1202 18:59:11.739547   42791 kubeadm.go:319] KERNEL_VERSION: 5.15.0-1084-aws
	I1202 18:59:11.739597   42791 kubeadm.go:319] OS: Linux
	I1202 18:59:11.739641   42791 kubeadm.go:319] CGROUPS_CPU: enabled
	I1202 18:59:11.739688   42791 kubeadm.go:319] CGROUPS_CPUACCT: enabled
	I1202 18:59:11.739733   42791 kubeadm.go:319] CGROUPS_CPUSET: enabled
	I1202 18:59:11.739780   42791 kubeadm.go:319] CGROUPS_DEVICES: enabled
	I1202 18:59:11.739827   42791 kubeadm.go:319] CGROUPS_FREEZER: enabled
	I1202 18:59:11.739873   42791 kubeadm.go:319] CGROUPS_MEMORY: enabled
	I1202 18:59:11.739917   42791 kubeadm.go:319] CGROUPS_PIDS: enabled
	I1202 18:59:11.739963   42791 kubeadm.go:319] CGROUPS_HUGETLB: enabled
	I1202 18:59:11.740008   42791 kubeadm.go:319] CGROUPS_BLKIO: enabled
	I1202 18:59:11.809299   42791 kubeadm.go:319] [preflight] Pulling images required for setting up a Kubernetes cluster
	I1202 18:59:11.809420   42791 kubeadm.go:319] [preflight] This might take a minute or two, depending on the speed of your internet connection
	I1202 18:59:11.809521   42791 kubeadm.go:319] [preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	I1202 18:59:11.814586   42791 kubeadm.go:319] [certs] Using certificateDir folder "/var/lib/minikube/certs"
	I1202 18:59:11.824092   42791 out.go:252]   - Generating certificates and keys ...
	I1202 18:59:11.824201   42791 kubeadm.go:319] [certs] Using existing ca certificate authority
	I1202 18:59:11.824279   42791 kubeadm.go:319] [certs] Using existing apiserver certificate and key on disk
	I1202 18:59:11.970256   42791 kubeadm.go:319] [certs] Generating "apiserver-kubelet-client" certificate and key
	I1202 18:59:12.248119   42791 kubeadm.go:319] [certs] Generating "front-proxy-ca" certificate and key
	I1202 18:59:12.556513   42791 kubeadm.go:319] [certs] Generating "front-proxy-client" certificate and key
	I1202 18:59:12.623381   42791 kubeadm.go:319] [certs] Generating "etcd/ca" certificate and key
	I1202 18:59:12.726314   42791 kubeadm.go:319] [certs] Generating "etcd/server" certificate and key
	I1202 18:59:12.726457   42791 kubeadm.go:319] [certs] etcd/server serving cert is signed for DNS names [functional-449836 localhost] and IPs [192.168.49.2 127.0.0.1 ::1]
	I1202 18:59:12.887892   42791 kubeadm.go:319] [certs] Generating "etcd/peer" certificate and key
	I1202 18:59:12.888313   42791 kubeadm.go:319] [certs] etcd/peer serving cert is signed for DNS names [functional-449836 localhost] and IPs [192.168.49.2 127.0.0.1 ::1]
	I1202 18:59:13.069672   42791 kubeadm.go:319] [certs] Generating "etcd/healthcheck-client" certificate and key
	I1202 18:59:13.685445   42791 kubeadm.go:319] [certs] Generating "apiserver-etcd-client" certificate and key
	I1202 18:59:14.118749   42791 kubeadm.go:319] [certs] Generating "sa" key and public key
	I1202 18:59:14.118979   42791 kubeadm.go:319] [kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	I1202 18:59:14.342440   42791 kubeadm.go:319] [kubeconfig] Writing "admin.conf" kubeconfig file
	I1202 18:59:14.713413   42791 kubeadm.go:319] [kubeconfig] Writing "super-admin.conf" kubeconfig file
	I1202 18:59:15.152952   42791 kubeadm.go:319] [kubeconfig] Writing "kubelet.conf" kubeconfig file
	I1202 18:59:15.551896   42791 kubeadm.go:319] [kubeconfig] Writing "controller-manager.conf" kubeconfig file
	I1202 18:59:15.713773   42791 kubeadm.go:319] [kubeconfig] Writing "scheduler.conf" kubeconfig file
	I1202 18:59:15.714351   42791 kubeadm.go:319] [etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	I1202 18:59:15.717061   42791 kubeadm.go:319] [control-plane] Using manifest folder "/etc/kubernetes/manifests"
	I1202 18:59:15.748685   42791 out.go:252]   - Booting up control plane ...
	I1202 18:59:15.748787   42791 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-apiserver"
	I1202 18:59:15.748864   42791 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-controller-manager"
	I1202 18:59:15.748929   42791 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-scheduler"
	I1202 18:59:15.749032   42791 kubeadm.go:319] [kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	I1202 18:59:15.749126   42791 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	I1202 18:59:15.751143   42791 kubeadm.go:319] [patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	I1202 18:59:15.751619   42791 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	I1202 18:59:15.751823   42791 kubeadm.go:319] [kubelet-start] Starting the kubelet
	I1202 18:59:15.891630   42791 kubeadm.go:319] [wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	I1202 18:59:15.891743   42791 kubeadm.go:319] [kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	I1202 19:03:15.892519   42791 kubeadm.go:319] [kubelet-check] The kubelet is not healthy after 4m0.000911274s
	I1202 19:03:15.892544   42791 kubeadm.go:319] 
	I1202 19:03:15.892602   42791 kubeadm.go:319] Unfortunately, an error has occurred, likely caused by:
	I1202 19:03:15.892632   42791 kubeadm.go:319] 	- The kubelet is not running
	I1202 19:03:15.892731   42791 kubeadm.go:319] 	- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	I1202 19:03:15.892736   42791 kubeadm.go:319] 
	I1202 19:03:15.892834   42791 kubeadm.go:319] If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
	I1202 19:03:15.892863   42791 kubeadm.go:319] 	- 'systemctl status kubelet'
	I1202 19:03:15.892891   42791 kubeadm.go:319] 	- 'journalctl -xeu kubelet'
	I1202 19:03:15.892894   42791 kubeadm.go:319] 
	I1202 19:03:15.896498   42791 kubeadm.go:319] 	[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
	I1202 19:03:15.896940   42791 kubeadm.go:319] 	[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
	I1202 19:03:15.897049   42791 kubeadm.go:319] 	[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	I1202 19:03:15.897307   42791 kubeadm.go:319] error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	I1202 19:03:15.897323   42791 kubeadm.go:319] 
	I1202 19:03:15.897390   42791 kubeadm.go:319] To see the stack trace of this error execute with --v=5 or higher
	W1202 19:03:15.897523   42791 out.go:285] ! initialization failed, will try again: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-beta.0
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Generating "apiserver-kubelet-client" certificate and key
	[certs] Generating "front-proxy-ca" certificate and key
	[certs] Generating "front-proxy-client" certificate and key
	[certs] Generating "etcd/ca" certificate and key
	[certs] Generating "etcd/server" certificate and key
	[certs] etcd/server serving cert is signed for DNS names [functional-449836 localhost] and IPs [192.168.49.2 127.0.0.1 ::1]
	[certs] Generating "etcd/peer" certificate and key
	[certs] etcd/peer serving cert is signed for DNS names [functional-449836 localhost] and IPs [192.168.49.2 127.0.0.1 ::1]
	[certs] Generating "etcd/healthcheck-client" certificate and key
	[certs] Generating "apiserver-etcd-client" certificate and key
	[certs] Generating "sa" key and public key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.000911274s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	
	To see the stack trace of this error execute with --v=5 or higher
	
	I1202 19:03:15.897619   42791 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm reset --cri-socket /run/containerd/containerd.sock --force"
	I1202 19:03:16.312004   42791 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I1202 19:03:16.325429   42791 kubeadm.go:215] ignoring SystemVerification for kubeadm because of docker driver
	I1202 19:03:16.325485   42791 ssh_runner.go:195] Run: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf
	I1202 19:03:16.333446   42791 kubeadm.go:156] config check failed, skipping stale config cleanup: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	ls: cannot access '/etc/kubernetes/admin.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/kubelet.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/controller-manager.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/scheduler.conf': No such file or directory
	I1202 19:03:16.333454   42791 kubeadm.go:158] found existing configuration files:
	
	I1202 19:03:16.333508   42791 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/admin.conf
	I1202 19:03:16.340969   42791 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/admin.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/admin.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/admin.conf: No such file or directory
	I1202 19:03:16.341023   42791 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/admin.conf
	I1202 19:03:16.348368   42791 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf
	I1202 19:03:16.355741   42791 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/kubelet.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/kubelet.conf: No such file or directory
	I1202 19:03:16.355794   42791 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/kubelet.conf
	I1202 19:03:16.363295   42791 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf
	I1202 19:03:16.371331   42791 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/controller-manager.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/controller-manager.conf: No such file or directory
	I1202 19:03:16.371385   42791 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/controller-manager.conf
	I1202 19:03:16.378614   42791 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf
	I1202 19:03:16.386080   42791 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/scheduler.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/scheduler.conf: No such file or directory
	I1202 19:03:16.386133   42791 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/scheduler.conf
	I1202 19:03:16.393798   42791 ssh_runner.go:286] Start: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables"
	I1202 19:03:16.510990   42791 kubeadm.go:319] 	[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
	I1202 19:03:16.511423   42791 kubeadm.go:319] 	[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
	I1202 19:03:16.578932   42791 kubeadm.go:319] 	[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	I1202 19:07:18.007623   42791 kubeadm.go:319] error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	I1202 19:07:18.007650   42791 kubeadm.go:319] 
	I1202 19:07:18.007723   42791 kubeadm.go:319] To see the stack trace of this error execute with --v=5 or higher
	I1202 19:07:18.011332   42791 kubeadm.go:319] [init] Using Kubernetes version: v1.35.0-beta.0
	I1202 19:07:18.011398   42791 kubeadm.go:319] [preflight] Running pre-flight checks
	I1202 19:07:18.011490   42791 kubeadm.go:319] [preflight] The system verification failed. Printing the output from the verification:
	I1202 19:07:18.011544   42791 kubeadm.go:319] KERNEL_VERSION: 5.15.0-1084-aws
	I1202 19:07:18.011579   42791 kubeadm.go:319] OS: Linux
	I1202 19:07:18.011622   42791 kubeadm.go:319] CGROUPS_CPU: enabled
	I1202 19:07:18.011668   42791 kubeadm.go:319] CGROUPS_CPUACCT: enabled
	I1202 19:07:18.011721   42791 kubeadm.go:319] CGROUPS_CPUSET: enabled
	I1202 19:07:18.011767   42791 kubeadm.go:319] CGROUPS_DEVICES: enabled
	I1202 19:07:18.011814   42791 kubeadm.go:319] CGROUPS_FREEZER: enabled
	I1202 19:07:18.011865   42791 kubeadm.go:319] CGROUPS_MEMORY: enabled
	I1202 19:07:18.011912   42791 kubeadm.go:319] CGROUPS_PIDS: enabled
	I1202 19:07:18.011963   42791 kubeadm.go:319] CGROUPS_HUGETLB: enabled
	I1202 19:07:18.012007   42791 kubeadm.go:319] CGROUPS_BLKIO: enabled
	I1202 19:07:18.012085   42791 kubeadm.go:319] [preflight] Pulling images required for setting up a Kubernetes cluster
	I1202 19:07:18.012189   42791 kubeadm.go:319] [preflight] This might take a minute or two, depending on the speed of your internet connection
	I1202 19:07:18.012281   42791 kubeadm.go:319] [preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	I1202 19:07:18.012374   42791 kubeadm.go:319] [certs] Using certificateDir folder "/var/lib/minikube/certs"
	I1202 19:07:18.015497   42791 out.go:252]   - Generating certificates and keys ...
	I1202 19:07:18.015625   42791 kubeadm.go:319] [certs] Using existing ca certificate authority
	I1202 19:07:18.015707   42791 kubeadm.go:319] [certs] Using existing apiserver certificate and key on disk
	I1202 19:07:18.015817   42791 kubeadm.go:319] [certs] Using existing apiserver-kubelet-client certificate and key on disk
	I1202 19:07:18.015902   42791 kubeadm.go:319] [certs] Using existing front-proxy-ca certificate authority
	I1202 19:07:18.015978   42791 kubeadm.go:319] [certs] Using existing front-proxy-client certificate and key on disk
	I1202 19:07:18.016033   42791 kubeadm.go:319] [certs] Using existing etcd/ca certificate authority
	I1202 19:07:18.016096   42791 kubeadm.go:319] [certs] Using existing etcd/server certificate and key on disk
	I1202 19:07:18.016156   42791 kubeadm.go:319] [certs] Using existing etcd/peer certificate and key on disk
	I1202 19:07:18.016265   42791 kubeadm.go:319] [certs] Using existing etcd/healthcheck-client certificate and key on disk
	I1202 19:07:18.016368   42791 kubeadm.go:319] [certs] Using existing apiserver-etcd-client certificate and key on disk
	I1202 19:07:18.016406   42791 kubeadm.go:319] [certs] Using the existing "sa" key
	I1202 19:07:18.016461   42791 kubeadm.go:319] [kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	I1202 19:07:18.016549   42791 kubeadm.go:319] [kubeconfig] Writing "admin.conf" kubeconfig file
	I1202 19:07:18.016610   42791 kubeadm.go:319] [kubeconfig] Writing "super-admin.conf" kubeconfig file
	I1202 19:07:18.016663   42791 kubeadm.go:319] [kubeconfig] Writing "kubelet.conf" kubeconfig file
	I1202 19:07:18.016725   42791 kubeadm.go:319] [kubeconfig] Writing "controller-manager.conf" kubeconfig file
	I1202 19:07:18.016779   42791 kubeadm.go:319] [kubeconfig] Writing "scheduler.conf" kubeconfig file
	I1202 19:07:18.016881   42791 kubeadm.go:319] [etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	I1202 19:07:18.016954   42791 kubeadm.go:319] [control-plane] Using manifest folder "/etc/kubernetes/manifests"
	I1202 19:07:18.020115   42791 out.go:252]   - Booting up control plane ...
	I1202 19:07:18.020248   42791 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-apiserver"
	I1202 19:07:18.020353   42791 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-controller-manager"
	I1202 19:07:18.020418   42791 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-scheduler"
	I1202 19:07:18.020526   42791 kubeadm.go:319] [kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	I1202 19:07:18.020618   42791 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	I1202 19:07:18.020728   42791 kubeadm.go:319] [patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	I1202 19:07:18.020812   42791 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	I1202 19:07:18.020850   42791 kubeadm.go:319] [kubelet-start] Starting the kubelet
	I1202 19:07:18.020983   42791 kubeadm.go:319] [wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	I1202 19:07:18.021089   42791 kubeadm.go:319] [kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	I1202 19:07:18.021155   42791 kubeadm.go:319] [kubelet-check] The kubelet is not healthy after 4m0.001152361s
	I1202 19:07:18.021158   42791 kubeadm.go:319] 
	I1202 19:07:18.021213   42791 kubeadm.go:319] Unfortunately, an error has occurred, likely caused by:
	I1202 19:07:18.021249   42791 kubeadm.go:319] 	- The kubelet is not running
	I1202 19:07:18.021353   42791 kubeadm.go:319] 	- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	I1202 19:07:18.021356   42791 kubeadm.go:319] 
	I1202 19:07:18.021459   42791 kubeadm.go:319] If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
	I1202 19:07:18.021501   42791 kubeadm.go:319] 	- 'systemctl status kubelet'
	I1202 19:07:18.021540   42791 kubeadm.go:319] 	- 'journalctl -xeu kubelet'
	I1202 19:07:18.021593   42791 kubeadm.go:319] 
	I1202 19:07:18.021604   42791 kubeadm.go:403] duration metric: took 8m6.535245053s to StartCluster
	I1202 19:07:18.021640   42791 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:07:18.021704   42791 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:07:18.046204   42791 cri.go:89] found id: ""
	I1202 19:07:18.046218   42791 logs.go:282] 0 containers: []
	W1202 19:07:18.046226   42791 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:07:18.046231   42791 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:07:18.046298   42791 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:07:18.074390   42791 cri.go:89] found id: ""
	I1202 19:07:18.074404   42791 logs.go:282] 0 containers: []
	W1202 19:07:18.074411   42791 logs.go:284] No container was found matching "etcd"
	I1202 19:07:18.074417   42791 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:07:18.074480   42791 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:07:18.099129   42791 cri.go:89] found id: ""
	I1202 19:07:18.099143   42791 logs.go:282] 0 containers: []
	W1202 19:07:18.099150   42791 logs.go:284] No container was found matching "coredns"
	I1202 19:07:18.099155   42791 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:07:18.099217   42791 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:07:18.124694   42791 cri.go:89] found id: ""
	I1202 19:07:18.124715   42791 logs.go:282] 0 containers: []
	W1202 19:07:18.124722   42791 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:07:18.124728   42791 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:07:18.124790   42791 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:07:18.150965   42791 cri.go:89] found id: ""
	I1202 19:07:18.150979   42791 logs.go:282] 0 containers: []
	W1202 19:07:18.150986   42791 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:07:18.150991   42791 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:07:18.151053   42791 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:07:18.176236   42791 cri.go:89] found id: ""
	I1202 19:07:18.176276   42791 logs.go:282] 0 containers: []
	W1202 19:07:18.176283   42791 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:07:18.176295   42791 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:07:18.176372   42791 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:07:18.201073   42791 cri.go:89] found id: ""
	I1202 19:07:18.201087   42791 logs.go:282] 0 containers: []
	W1202 19:07:18.201094   42791 logs.go:284] No container was found matching "kindnet"
	I1202 19:07:18.201102   42791 logs.go:123] Gathering logs for kubelet ...
	I1202 19:07:18.201112   42791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:07:18.256328   42791 logs.go:123] Gathering logs for dmesg ...
	I1202 19:07:18.256346   42791 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:07:18.267255   42791 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:07:18.267270   42791 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:07:18.334547   42791 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:07:18.325181    5393 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:07:18.326090    5393 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:07:18.327868    5393 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:07:18.328694    5393 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:07:18.330416    5393 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:07:18.325181    5393 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:07:18.326090    5393 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:07:18.327868    5393 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:07:18.328694    5393 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:07:18.330416    5393 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:07:18.334559   42791 logs.go:123] Gathering logs for containerd ...
	I1202 19:07:18.334569   42791 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:07:18.377545   42791 logs.go:123] Gathering logs for container status ...
	I1202 19:07:18.377562   42791 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	W1202 19:07:18.405738   42791 out.go:434] Error starting cluster: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-beta.0
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.001152361s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	
	To see the stack trace of this error execute with --v=5 or higher
	W1202 19:07:18.405778   42791 out.go:285] * 
	W1202 19:07:18.405837   42791 out.go:285] X Error starting cluster: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-beta.0
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.001152361s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	
	To see the stack trace of this error execute with --v=5 or higher
	
	W1202 19:07:18.405850   42791 out.go:285] * 
	W1202 19:07:18.408205   42791 out.go:308] ╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	I1202 19:07:18.413536   42791 out.go:203] 
	W1202 19:07:18.416345   42791 out.go:285] X Exiting due to K8S_KUBELET_NOT_RUNNING: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-beta.0
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.001152361s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	
	To see the stack trace of this error execute with --v=5 or higher
	
	W1202 19:07:18.416393   42791 out.go:285] * Suggestion: Check output of 'journalctl -xeu kubelet', try passing --extra-config=kubelet.cgroup-driver=systemd to minikube start
	W1202 19:07:18.416416   42791 out.go:285] * Related issue: https://github.com/kubernetes/minikube/issues/4172
	I1202 19:07:18.419487   42791 out.go:203] 
	
	
	==> container status <==
	CONTAINER           IMAGE               CREATED             STATE               NAME                ATTEMPT             POD ID              POD                 NAMESPACE
	
	
	==> containerd <==
	Dec 02 18:59:03 functional-449836 containerd[764]: time="2025-12-02T18:59:03.531066084Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/kube-controller-manager:v1.35.0-beta.0\"  labels:{key:\"io.cri-containerd.image\"  value:\"managed\"}"
	Dec 02 18:59:04 functional-449836 containerd[764]: time="2025-12-02T18:59:04.918633971Z" level=info msg="No images store for sha256:89a52ae86f116708cd5ba0d54dfbf2ae3011f126ee9161c4afb19bf2a51ef285"
	Dec 02 18:59:04 functional-449836 containerd[764]: time="2025-12-02T18:59:04.920818079Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.6.5-0\""
	Dec 02 18:59:04 functional-449836 containerd[764]: time="2025-12-02T18:59:04.934352889Z" level=info msg="ImageCreate event name:\"sha256:2c5f0dedd21c25ec3a6709934d22152d53ec50fe57b72d29e4450655e3d14d42\"  labels:{key:\"io.cri-containerd.image\"  value:\"managed\"}"
	Dec 02 18:59:04 functional-449836 containerd[764]: time="2025-12-02T18:59:04.935016922Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/etcd:3.6.5-0\"  labels:{key:\"io.cri-containerd.image\"  value:\"managed\"}"
	Dec 02 18:59:05 functional-449836 containerd[764]: time="2025-12-02T18:59:05.950748344Z" level=info msg="No images store for sha256:eb9020767c0d3bbd754f3f52cbe4c8bdd935dd5862604d6dc0b1f10422189544"
	Dec 02 18:59:05 functional-449836 containerd[764]: time="2025-12-02T18:59:05.952937269Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.35.0-beta.0\""
	Dec 02 18:59:05 functional-449836 containerd[764]: time="2025-12-02T18:59:05.960084990Z" level=info msg="ImageCreate event name:\"sha256:404c2e12861777b763b8feaa316d36680fc68ad308a8d2f6e55f1bb981cdd904\"  labels:{key:\"io.cri-containerd.image\"  value:\"managed\"}"
	Dec 02 18:59:05 functional-449836 containerd[764]: time="2025-12-02T18:59:05.960411226Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/kube-proxy:v1.35.0-beta.0\"  labels:{key:\"io.cri-containerd.image\"  value:\"managed\"}"
	Dec 02 18:59:06 functional-449836 containerd[764]: time="2025-12-02T18:59:06.919972924Z" level=info msg="No images store for sha256:5e4a4fe83792bf529a4e283e09069cf50cc9882d04168a33903ed6809a492e61"
	Dec 02 18:59:06 functional-449836 containerd[764]: time="2025-12-02T18:59:06.922088946Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.13.1\""
	Dec 02 18:59:06 functional-449836 containerd[764]: time="2025-12-02T18:59:06.941256244Z" level=info msg="ImageCreate event name:\"sha256:e08f4d9d2e6ede8185064c13b41f8eeee95b609c0ca93b6fe7509fe527c907cf\"  labels:{key:\"io.cri-containerd.image\"  value:\"managed\"}"
	Dec 02 18:59:06 functional-449836 containerd[764]: time="2025-12-02T18:59:06.942044051Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/coredns/coredns:v1.13.1\"  labels:{key:\"io.cri-containerd.image\"  value:\"managed\"}"
	Dec 02 18:59:07 functional-449836 containerd[764]: time="2025-12-02T18:59:07.715916014Z" level=info msg="No images store for sha256:84ea4651cf4d4486006d1346129c6964687be99508987d0ca606406fbc15a298"
	Dec 02 18:59:07 functional-449836 containerd[764]: time="2025-12-02T18:59:07.718237115Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.35.0-beta.0\""
	Dec 02 18:59:07 functional-449836 containerd[764]: time="2025-12-02T18:59:07.728198868Z" level=info msg="ImageCreate event name:\"sha256:16378741539f1be9c6e347d127537d379a6592587b09b4eb47964cb5c43a409b\"  labels:{key:\"io.cri-containerd.image\"  value:\"managed\"}"
	Dec 02 18:59:07 functional-449836 containerd[764]: time="2025-12-02T18:59:07.729109925Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/kube-scheduler:v1.35.0-beta.0\"  labels:{key:\"io.cri-containerd.image\"  value:\"managed\"}"
	Dec 02 18:59:08 functional-449836 containerd[764]: time="2025-12-02T18:59:08.775930433Z" level=info msg="No images store for sha256:64f3fb0a3392f487dbd4300c920f76dc3de2961e11fd6bfbedc75c0d25b1954c"
	Dec 02 18:59:08 functional-449836 containerd[764]: time="2025-12-02T18:59:08.778191423Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.35.0-beta.0\""
	Dec 02 18:59:08 functional-449836 containerd[764]: time="2025-12-02T18:59:08.787016306Z" level=info msg="ImageCreate event name:\"sha256:ccd634d9bcc36ac6235e9c86676cd3a02c06afc3788a25f1bbf39ca7d44585f4\"  labels:{key:\"io.cri-containerd.image\"  value:\"managed\"}"
	Dec 02 18:59:08 functional-449836 containerd[764]: time="2025-12-02T18:59:08.787945119Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/kube-apiserver:v1.35.0-beta.0\"  labels:{key:\"io.cri-containerd.image\"  value:\"managed\"}"
	Dec 02 18:59:09 functional-449836 containerd[764]: time="2025-12-02T18:59:09.135403206Z" level=info msg="No images store for sha256:7475c7d18769df89a804d5bebf679dbf94886f3626f07a2be923beaa0cc7e5b0"
	Dec 02 18:59:09 functional-449836 containerd[764]: time="2025-12-02T18:59:09.138423671Z" level=info msg="ImageCreate event name:\"gcr.io/k8s-minikube/storage-provisioner:v5\""
	Dec 02 18:59:09 functional-449836 containerd[764]: time="2025-12-02T18:59:09.146682786Z" level=info msg="ImageCreate event name:\"sha256:66749159455b3f08c8318fe0233122f54d0f5889f9c5fdfb73c3fd9d99895b51\"  labels:{key:\"io.cri-containerd.image\"  value:\"managed\"}"
	Dec 02 18:59:09 functional-449836 containerd[764]: time="2025-12-02T18:59:09.147313481Z" level=info msg="ImageUpdate event name:\"gcr.io/k8s-minikube/storage-provisioner:v5\"  labels:{key:\"io.cri-containerd.image\"  value:\"managed\"}"
	
	
	==> describe nodes <==
	command /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" failed with error: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:07:19.403916    5507 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:07:19.404686    5507 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:07:19.406375    5507 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:07:19.407090    5507 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:07:19.408738    5507 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	
	==> dmesg <==
	[Dec 2 18:17] ACPI: SRAT not present
	[  +0.000000] ACPI: SRAT not present
	[  +0.000000] SPI driver altr_a10sr has no spi_device_id for altr,a10sr
	[  +0.015127] device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
	[  +0.494583] systemd[1]: Configuration file /run/systemd/system/netplan-ovs-cleanup.service is marked world-inaccessible. This has no effect as configuration data is accessible via APIs without restrictions. Proceeding anyway.
	[  +0.035754] systemd[1]: /lib/systemd/system/snapd.service:23: Unknown key name 'RestartMode' in section 'Service', ignoring.
	[  +0.870945] ena 0000:00:05.0: LLQ is not supported Fallback to host mode policy.
	[  +6.299680] kauditd_printk_skb: 36 callbacks suppressed
	
	
	==> kernel <==
	 19:07:19 up 49 min,  0 user,  load average: 0.47, 0.53, 0.66
	Linux functional-449836 5.15.0-1084-aws #91~20.04.1-Ubuntu SMP Fri May 2 07:00:04 UTC 2025 aarch64 GNU/Linux
	PRETTY_NAME="Debian GNU/Linux 12 (bookworm)"
	
	
	==> kubelet <==
	Dec 02 19:07:16 functional-449836 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 02 19:07:17 functional-449836 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 319.
	Dec 02 19:07:17 functional-449836 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 02 19:07:17 functional-449836 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 02 19:07:17 functional-449836 kubelet[5319]: E1202 19:07:17.220937    5319 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 02 19:07:17 functional-449836 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 02 19:07:17 functional-449836 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 02 19:07:17 functional-449836 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 320.
	Dec 02 19:07:17 functional-449836 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 02 19:07:17 functional-449836 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 02 19:07:17 functional-449836 kubelet[5324]: E1202 19:07:17.963966    5324 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 02 19:07:17 functional-449836 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 02 19:07:17 functional-449836 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 02 19:07:18 functional-449836 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 321.
	Dec 02 19:07:18 functional-449836 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 02 19:07:18 functional-449836 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 02 19:07:18 functional-449836 kubelet[5416]: E1202 19:07:18.740396    5416 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 02 19:07:18 functional-449836 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 02 19:07:18 functional-449836 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 02 19:07:19 functional-449836 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 322.
	Dec 02 19:07:19 functional-449836 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 02 19:07:19 functional-449836 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 02 19:07:19 functional-449836 kubelet[5512]: E1202 19:07:19.480083    5512 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 02 19:07:19 functional-449836 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 02 19:07:19 functional-449836 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	

-- /stdout --
helpers_test.go:262: (dbg) Run:  out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-449836 -n functional-449836
helpers_test.go:262: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-449836 -n functional-449836: exit status 6 (333.240306ms)

-- stdout --
	Stopped
	WARNING: Your kubectl is pointing to stale minikube-vm.
	To fix the kubectl context, run `minikube update-context`

-- /stdout --
** stderr ** 
	E1202 19:07:19.868520   49017 status.go:458] kubeconfig endpoint: get endpoint: "functional-449836" does not appear in /home/jenkins/minikube-integration/22021-2487/kubeconfig

** /stderr **
helpers_test.go:262: status error: exit status 6 (may be ok)
helpers_test.go:264: "functional-449836" apiserver is not running, skipping kubectl commands (state="Stopped")
--- FAIL: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/StartWithProxy (507.66s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/SoftStart (369.03s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/SoftStart
I1202 19:07:19.882640    4435 config.go:182] Loaded profile config "functional-449836": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-beta.0
functional_test.go:674: (dbg) Run:  out/minikube-linux-arm64 start -p functional-449836 --alsologtostderr -v=8
E1202 19:08:00.703530    4435 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-224594/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1202 19:08:28.411123    4435 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-224594/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1202 19:10:46.066185    4435 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/addons-932514/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1202 19:12:09.132057    4435 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/addons-932514/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1202 19:13:00.704154    4435 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-224594/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
functional_test.go:674: (dbg) Non-zero exit: out/minikube-linux-arm64 start -p functional-449836 --alsologtostderr -v=8: exit status 80 (6m6.060429757s)

-- stdout --
	* [functional-449836] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	  - MINIKUBE_LOCATION=22021
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - KUBECONFIG=/home/jenkins/minikube-integration/22021-2487/kubeconfig
	  - MINIKUBE_HOME=/home/jenkins/minikube-integration/22021-2487/.minikube
	  - MINIKUBE_BIN=out/minikube-linux-arm64
	  - MINIKUBE_FORCE_SYSTEMD=
	* Using the docker driver based on existing profile
	* Starting "functional-449836" primary control-plane node in "functional-449836" cluster
	* Pulling base image v0.0.48-1764169655-21974 ...
	* Preparing Kubernetes v1.35.0-beta.0 on containerd 2.1.5 ...
	* Verifying Kubernetes components...
	  - Using image gcr.io/k8s-minikube/storage-provisioner:v5
	* Enabled addons: 
	
	

-- /stdout --
** stderr ** 
	I1202 19:07:19.929855   49088 out.go:360] Setting OutFile to fd 1 ...
	I1202 19:07:19.930082   49088 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1202 19:07:19.930109   49088 out.go:374] Setting ErrFile to fd 2...
	I1202 19:07:19.930127   49088 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1202 19:07:19.930424   49088 root.go:338] Updating PATH: /home/jenkins/minikube-integration/22021-2487/.minikube/bin
	I1202 19:07:19.930829   49088 out.go:368] Setting JSON to false
	I1202 19:07:19.931678   49088 start.go:133] hostinfo: {"hostname":"ip-172-31-31-251","uptime":2976,"bootTime":1764699464,"procs":155,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1084-aws","kernelArch":"aarch64","virtualizationSystem":"","virtualizationRole":"","hostId":"982e3628-3742-4b3e-bb63-ac1b07660ec7"}
	I1202 19:07:19.931776   49088 start.go:143] virtualization:  
	I1202 19:07:19.935245   49088 out.go:179] * [functional-449836] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	I1202 19:07:19.939094   49088 out.go:179]   - MINIKUBE_LOCATION=22021
	I1202 19:07:19.939188   49088 notify.go:221] Checking for updates...
	I1202 19:07:19.944799   49088 out.go:179]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I1202 19:07:19.947646   49088 out.go:179]   - KUBECONFIG=/home/jenkins/minikube-integration/22021-2487/kubeconfig
	I1202 19:07:19.950501   49088 out.go:179]   - MINIKUBE_HOME=/home/jenkins/minikube-integration/22021-2487/.minikube
	I1202 19:07:19.953361   49088 out.go:179]   - MINIKUBE_BIN=out/minikube-linux-arm64
	I1202 19:07:19.956281   49088 out.go:179]   - MINIKUBE_FORCE_SYSTEMD=
	I1202 19:07:19.959695   49088 config.go:182] Loaded profile config "functional-449836": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-beta.0
	I1202 19:07:19.959887   49088 driver.go:422] Setting default libvirt URI to qemu:///system
	I1202 19:07:19.996438   49088 docker.go:124] docker version: linux-28.1.1:Docker Engine - Community
	I1202 19:07:19.996577   49088 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1202 19:07:20.063124   49088 info.go:266] docker info: {ID:EOU5:DNGX:XN6V:L2FZ:UXRM:5TWK:EVUR:KC2F:GT7Z:Y4O4:GB77:5PD3 Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:52 SystemTime:2025-12-02 19:07:20.053388152 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214839296 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-31-251 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1202 19:07:20.063232   49088 docker.go:319] overlay module found
	I1202 19:07:20.066390   49088 out.go:179] * Using the docker driver based on existing profile
	I1202 19:07:20.069271   49088 start.go:309] selected driver: docker
	I1202 19:07:20.069311   49088 start.go:927] validating driver "docker" against &{Name:functional-449836 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-449836 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1202 19:07:20.069422   49088 start.go:938] status for docker: {Installed:true Healthy:true Running:false NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I1202 19:07:20.069541   49088 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1202 19:07:20.132012   49088 info.go:266] docker info: {ID:EOU5:DNGX:XN6V:L2FZ:UXRM:5TWK:EVUR:KC2F:GT7Z:Y4O4:GB77:5PD3 Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:52 SystemTime:2025-12-02 19:07:20.122627931 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214839296 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-31-251 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1202 19:07:20.132615   49088 cni.go:84] Creating CNI manager for ""
	I1202 19:07:20.132692   49088 cni.go:143] "docker" driver + "containerd" runtime found, recommending kindnet
	I1202 19:07:20.132751   49088 start.go:353] cluster config:
	{Name:functional-449836 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-449836 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1202 19:07:20.135845   49088 out.go:179] * Starting "functional-449836" primary control-plane node in "functional-449836" cluster
	I1202 19:07:20.138639   49088 cache.go:134] Beginning downloading kic base image for docker with containerd
	I1202 19:07:20.141498   49088 out.go:179] * Pulling base image v0.0.48-1764169655-21974 ...
	I1202 19:07:20.144479   49088 image.go:81] Checking for gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b in local docker daemon
	I1202 19:07:20.144604   49088 preload.go:188] Checking if preload exists for k8s version v1.35.0-beta.0 and runtime containerd
	I1202 19:07:20.163347   49088 image.go:100] Found gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b in local docker daemon, skipping pull
	I1202 19:07:20.163372   49088 cache.go:158] gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b exists in daemon, skipping load
	W1202 19:07:20.218193   49088 preload.go:144] https://storage.googleapis.com/minikube-preloaded-volume-tarballs/v18/v1.35.0-beta.0/preloaded-images-k8s-v18-v1.35.0-beta.0-containerd-overlay2-arm64.tar.lz4 status code: 404
	W1202 19:07:20.422833   49088 preload.go:144] https://github.com/kubernetes-sigs/minikube-preloads/releases/download/v18/preloaded-images-k8s-v18-v1.35.0-beta.0-containerd-overlay2-arm64.tar.lz4 status code: 404
	I1202 19:07:20.423042   49088 profile.go:143] Saving config to /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/config.json ...
	I1202 19:07:20.423128   49088 cache.go:107] acquiring lock: {Name:mkb3ffc95e4b7ac3756206049d851bf516a8abb7 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1202 19:07:20.423219   49088 cache.go:115] /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/gcr.io/k8s-minikube/storage-provisioner_v5 exists
	I1202 19:07:20.423234   49088 cache.go:96] cache image "gcr.io/k8s-minikube/storage-provisioner:v5" -> "/home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/gcr.io/k8s-minikube/storage-provisioner_v5" took 122.125µs
	I1202 19:07:20.423249   49088 cache.go:80] save to tar file gcr.io/k8s-minikube/storage-provisioner:v5 -> /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/gcr.io/k8s-minikube/storage-provisioner_v5 succeeded
	I1202 19:07:20.423267   49088 cache.go:107] acquiring lock: {Name:mkfa39bba55c97fa80e441f8dcbaf6dc6a2ab6fc Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1202 19:07:20.423303   49088 cache.go:115] /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/kube-apiserver_v1.35.0-beta.0 exists
	I1202 19:07:20.423312   49088 cache.go:96] cache image "registry.k8s.io/kube-apiserver:v1.35.0-beta.0" -> "/home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/kube-apiserver_v1.35.0-beta.0" took 47.557µs
	I1202 19:07:20.423318   49088 cache.go:80] save to tar file registry.k8s.io/kube-apiserver:v1.35.0-beta.0 -> /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/kube-apiserver_v1.35.0-beta.0 succeeded
	I1202 19:07:20.423331   49088 cache.go:107] acquiring lock: {Name:mk7e3720bc30e96a70479f1acc707ef52791d566 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1202 19:07:20.423365   49088 cache.go:115] /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/kube-controller-manager_v1.35.0-beta.0 exists
	I1202 19:07:20.423374   49088 cache.go:96] cache image "registry.k8s.io/kube-controller-manager:v1.35.0-beta.0" -> "/home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/kube-controller-manager_v1.35.0-beta.0" took 44.415µs
	I1202 19:07:20.423380   49088 cache.go:80] save to tar file registry.k8s.io/kube-controller-manager:v1.35.0-beta.0 -> /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/kube-controller-manager_v1.35.0-beta.0 succeeded
	I1202 19:07:20.423395   49088 cache.go:107] acquiring lock: {Name:mk87fcb81abcb9216a37cb770c1db1797c0a7f91 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1202 19:07:20.423422   49088 cache.go:115] /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/kube-scheduler_v1.35.0-beta.0 exists
	I1202 19:07:20.423432   49088 cache.go:96] cache image "registry.k8s.io/kube-scheduler:v1.35.0-beta.0" -> "/home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/kube-scheduler_v1.35.0-beta.0" took 37.579µs
	I1202 19:07:20.423438   49088 cache.go:80] save to tar file registry.k8s.io/kube-scheduler:v1.35.0-beta.0 -> /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/kube-scheduler_v1.35.0-beta.0 succeeded
	I1202 19:07:20.423447   49088 cache.go:107] acquiring lock: {Name:mkb0da8840651a370490ea2b46213e13fc0d5dac Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1202 19:07:20.423476   49088 cache.go:115] /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/kube-proxy_v1.35.0-beta.0 exists
	I1202 19:07:20.423484   49088 cache.go:96] cache image "registry.k8s.io/kube-proxy:v1.35.0-beta.0" -> "/home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/kube-proxy_v1.35.0-beta.0" took 38.933µs
	I1202 19:07:20.423490   49088 cache.go:80] save to tar file registry.k8s.io/kube-proxy:v1.35.0-beta.0 -> /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/kube-proxy_v1.35.0-beta.0 succeeded
	I1202 19:07:20.423510   49088 cache.go:107] acquiring lock: {Name:mk9eec99a3e8e54b076a2ce506d08ceb8a7f49cb Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1202 19:07:20.423540   49088 cache.go:115] /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/pause_3.10.1 exists
	I1202 19:07:20.423549   49088 cache.go:96] cache image "registry.k8s.io/pause:3.10.1" -> "/home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/pause_3.10.1" took 40.796µs
	I1202 19:07:20.423555   49088 cache.go:80] save to tar file registry.k8s.io/pause:3.10.1 -> /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/pause_3.10.1 succeeded
	I1202 19:07:20.423569   49088 cache.go:243] Successfully downloaded all kic artifacts
	I1202 19:07:20.423588   49088 cache.go:107] acquiring lock: {Name:mkbf2c8ea9fae755e8e7ae1c483527f313757bae Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1202 19:07:20.423620   49088 cache.go:115] /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/coredns/coredns_v1.13.1 exists
	I1202 19:07:20.423629   49088 cache.go:96] cache image "registry.k8s.io/coredns/coredns:v1.13.1" -> "/home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/coredns/coredns_v1.13.1" took 42.487µs
	I1202 19:07:20.423635   49088 cache.go:80] save to tar file registry.k8s.io/coredns/coredns:v1.13.1 -> /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/coredns/coredns_v1.13.1 succeeded
	I1202 19:07:20.423646   49088 start.go:360] acquireMachinesLock for functional-449836: {Name:mk8999fdfa518fc15358d07431fe9bec286a035e Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1202 19:07:20.423706   49088 start.go:364] duration metric: took 31.868µs to acquireMachinesLock for "functional-449836"
	I1202 19:07:20.423570   49088 cache.go:107] acquiring lock: {Name:mk280b51a6d3bfe0cb60ae7355309f1bf1f99e1d Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1202 19:07:20.423753   49088 start.go:96] Skipping create...Using existing machine configuration
	I1202 19:07:20.423783   49088 fix.go:54] fixHost starting: 
	I1202 19:07:20.423759   49088 cache.go:115] /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/etcd_3.6.5-0 exists
	I1202 19:07:20.423888   49088 cache.go:96] cache image "registry.k8s.io/etcd:3.6.5-0" -> "/home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/etcd_3.6.5-0" took 323.2µs
	I1202 19:07:20.423896   49088 cache.go:80] save to tar file registry.k8s.io/etcd:3.6.5-0 -> /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/etcd_3.6.5-0 succeeded
	I1202 19:07:20.423906   49088 cache.go:87] Successfully saved all images to host disk.
	I1202 19:07:20.424111   49088 cli_runner.go:164] Run: docker container inspect functional-449836 --format={{.State.Status}}
	I1202 19:07:20.441213   49088 fix.go:112] recreateIfNeeded on functional-449836: state=Running err=<nil>
	W1202 19:07:20.441244   49088 fix.go:138] unexpected machine state, will restart: <nil>
	I1202 19:07:20.444707   49088 out.go:252] * Updating the running docker "functional-449836" container ...
	I1202 19:07:20.444749   49088 machine.go:94] provisionDockerMachine start ...
	I1202 19:07:20.444842   49088 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-449836
	I1202 19:07:20.461943   49088 main.go:143] libmachine: Using SSH client type: native
	I1202 19:07:20.462269   49088 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3dad70] 0x3dd270 <nil>  [] 0s} 127.0.0.1 32788 <nil> <nil>}
	I1202 19:07:20.462284   49088 main.go:143] libmachine: About to run SSH command:
	hostname
	I1202 19:07:20.612055   49088 main.go:143] libmachine: SSH cmd err, output: <nil>: functional-449836
	
	I1202 19:07:20.612125   49088 ubuntu.go:182] provisioning hostname "functional-449836"
	I1202 19:07:20.612222   49088 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-449836
	I1202 19:07:20.629856   49088 main.go:143] libmachine: Using SSH client type: native
	I1202 19:07:20.630166   49088 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3dad70] 0x3dd270 <nil>  [] 0s} 127.0.0.1 32788 <nil> <nil>}
	I1202 19:07:20.630180   49088 main.go:143] libmachine: About to run SSH command:
	sudo hostname functional-449836 && echo "functional-449836" | sudo tee /etc/hostname
	I1202 19:07:20.793419   49088 main.go:143] libmachine: SSH cmd err, output: <nil>: functional-449836
	
	I1202 19:07:20.793536   49088 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-449836
	I1202 19:07:20.812441   49088 main.go:143] libmachine: Using SSH client type: native
	I1202 19:07:20.812754   49088 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3dad70] 0x3dd270 <nil>  [] 0s} 127.0.0.1 32788 <nil> <nil>}
	I1202 19:07:20.812775   49088 main.go:143] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sfunctional-449836' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 functional-449836/g' /etc/hosts;
				else 
					echo '127.0.1.1 functional-449836' | sudo tee -a /etc/hosts; 
				fi
			fi
	I1202 19:07:20.961443   49088 main.go:143] libmachine: SSH cmd err, output: <nil>: 
	I1202 19:07:20.961480   49088 ubuntu.go:188] set auth options {CertDir:/home/jenkins/minikube-integration/22021-2487/.minikube CaCertPath:/home/jenkins/minikube-integration/22021-2487/.minikube/certs/ca.pem CaPrivateKeyPath:/home/jenkins/minikube-integration/22021-2487/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/home/jenkins/minikube-integration/22021-2487/.minikube/machines/server.pem ServerKeyPath:/home/jenkins/minikube-integration/22021-2487/.minikube/machines/server-key.pem ClientKeyPath:/home/jenkins/minikube-integration/22021-2487/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/home/jenkins/minikube-integration/22021-2487/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/home/jenkins/minikube-integration/22021-2487/.minikube}
	I1202 19:07:20.961539   49088 ubuntu.go:190] setting up certificates
	I1202 19:07:20.961556   49088 provision.go:84] configureAuth start
	I1202 19:07:20.961634   49088 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" functional-449836
	I1202 19:07:20.990731   49088 provision.go:143] copyHostCerts
	I1202 19:07:20.990790   49088 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22021-2487/.minikube/certs/ca.pem -> /home/jenkins/minikube-integration/22021-2487/.minikube/ca.pem
	I1202 19:07:20.990838   49088 exec_runner.go:144] found /home/jenkins/minikube-integration/22021-2487/.minikube/ca.pem, removing ...
	I1202 19:07:20.990856   49088 exec_runner.go:203] rm: /home/jenkins/minikube-integration/22021-2487/.minikube/ca.pem
	I1202 19:07:20.990938   49088 exec_runner.go:151] cp: /home/jenkins/minikube-integration/22021-2487/.minikube/certs/ca.pem --> /home/jenkins/minikube-integration/22021-2487/.minikube/ca.pem (1082 bytes)
	I1202 19:07:20.991037   49088 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22021-2487/.minikube/certs/cert.pem -> /home/jenkins/minikube-integration/22021-2487/.minikube/cert.pem
	I1202 19:07:20.991060   49088 exec_runner.go:144] found /home/jenkins/minikube-integration/22021-2487/.minikube/cert.pem, removing ...
	I1202 19:07:20.991069   49088 exec_runner.go:203] rm: /home/jenkins/minikube-integration/22021-2487/.minikube/cert.pem
	I1202 19:07:20.991098   49088 exec_runner.go:151] cp: /home/jenkins/minikube-integration/22021-2487/.minikube/certs/cert.pem --> /home/jenkins/minikube-integration/22021-2487/.minikube/cert.pem (1123 bytes)
	I1202 19:07:20.991189   49088 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22021-2487/.minikube/certs/key.pem -> /home/jenkins/minikube-integration/22021-2487/.minikube/key.pem
	I1202 19:07:20.991211   49088 exec_runner.go:144] found /home/jenkins/minikube-integration/22021-2487/.minikube/key.pem, removing ...
	I1202 19:07:20.991220   49088 exec_runner.go:203] rm: /home/jenkins/minikube-integration/22021-2487/.minikube/key.pem
	I1202 19:07:20.991247   49088 exec_runner.go:151] cp: /home/jenkins/minikube-integration/22021-2487/.minikube/certs/key.pem --> /home/jenkins/minikube-integration/22021-2487/.minikube/key.pem (1675 bytes)
	I1202 19:07:20.991297   49088 provision.go:117] generating server cert: /home/jenkins/minikube-integration/22021-2487/.minikube/machines/server.pem ca-key=/home/jenkins/minikube-integration/22021-2487/.minikube/certs/ca.pem private-key=/home/jenkins/minikube-integration/22021-2487/.minikube/certs/ca-key.pem org=jenkins.functional-449836 san=[127.0.0.1 192.168.49.2 functional-449836 localhost minikube]
	I1202 19:07:21.335552   49088 provision.go:177] copyRemoteCerts
	I1202 19:07:21.335618   49088 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I1202 19:07:21.335658   49088 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-449836
	I1202 19:07:21.354079   49088 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:32788 SSHKeyPath:/home/jenkins/minikube-integration/22021-2487/.minikube/machines/functional-449836/id_rsa Username:docker}
	I1202 19:07:21.460475   49088 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22021-2487/.minikube/machines/server.pem -> /etc/docker/server.pem
	I1202 19:07:21.460535   49088 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/machines/server.pem --> /etc/docker/server.pem (1220 bytes)
	I1202 19:07:21.478965   49088 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22021-2487/.minikube/machines/server-key.pem -> /etc/docker/server-key.pem
	I1202 19:07:21.479028   49088 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1675 bytes)
	I1202 19:07:21.497363   49088 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22021-2487/.minikube/certs/ca.pem -> /etc/docker/ca.pem
	I1202 19:07:21.497471   49088 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1082 bytes)
	I1202 19:07:21.514946   49088 provision.go:87] duration metric: took 553.36724ms to configureAuth
	I1202 19:07:21.515020   49088 ubuntu.go:206] setting minikube options for container-runtime
	I1202 19:07:21.515215   49088 config.go:182] Loaded profile config "functional-449836": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-beta.0
	I1202 19:07:21.515248   49088 machine.go:97] duration metric: took 1.070490831s to provisionDockerMachine
	I1202 19:07:21.515264   49088 start.go:293] postStartSetup for "functional-449836" (driver="docker")
	I1202 19:07:21.515276   49088 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I1202 19:07:21.515329   49088 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I1202 19:07:21.515382   49088 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-449836
	I1202 19:07:21.532644   49088 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:32788 SSHKeyPath:/home/jenkins/minikube-integration/22021-2487/.minikube/machines/functional-449836/id_rsa Username:docker}
	I1202 19:07:21.636416   49088 ssh_runner.go:195] Run: cat /etc/os-release
	I1202 19:07:21.639685   49088 command_runner.go:130] > PRETTY_NAME="Debian GNU/Linux 12 (bookworm)"
	I1202 19:07:21.639756   49088 command_runner.go:130] > NAME="Debian GNU/Linux"
	I1202 19:07:21.639777   49088 command_runner.go:130] > VERSION_ID="12"
	I1202 19:07:21.639798   49088 command_runner.go:130] > VERSION="12 (bookworm)"
	I1202 19:07:21.639827   49088 command_runner.go:130] > VERSION_CODENAME=bookworm
	I1202 19:07:21.639832   49088 command_runner.go:130] > ID=debian
	I1202 19:07:21.639847   49088 command_runner.go:130] > HOME_URL="https://www.debian.org/"
	I1202 19:07:21.639859   49088 command_runner.go:130] > SUPPORT_URL="https://www.debian.org/support"
	I1202 19:07:21.639866   49088 command_runner.go:130] > BUG_REPORT_URL="https://bugs.debian.org/"
	I1202 19:07:21.639943   49088 main.go:143] libmachine: Couldn't set key VERSION_CODENAME, no corresponding struct field found
	I1202 19:07:21.639962   49088 info.go:137] Remote host: Debian GNU/Linux 12 (bookworm)
	I1202 19:07:21.639974   49088 filesync.go:126] Scanning /home/jenkins/minikube-integration/22021-2487/.minikube/addons for local assets ...
	I1202 19:07:21.640036   49088 filesync.go:126] Scanning /home/jenkins/minikube-integration/22021-2487/.minikube/files for local assets ...
	I1202 19:07:21.640112   49088 filesync.go:149] local asset: /home/jenkins/minikube-integration/22021-2487/.minikube/files/etc/ssl/certs/44352.pem -> 44352.pem in /etc/ssl/certs
	I1202 19:07:21.640123   49088 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22021-2487/.minikube/files/etc/ssl/certs/44352.pem -> /etc/ssl/certs/44352.pem
	I1202 19:07:21.640204   49088 filesync.go:149] local asset: /home/jenkins/minikube-integration/22021-2487/.minikube/files/etc/test/nested/copy/4435/hosts -> hosts in /etc/test/nested/copy/4435
	I1202 19:07:21.640213   49088 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22021-2487/.minikube/files/etc/test/nested/copy/4435/hosts -> /etc/test/nested/copy/4435/hosts
	I1202 19:07:21.640263   49088 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs /etc/test/nested/copy/4435
	I1202 19:07:21.647807   49088 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/files/etc/ssl/certs/44352.pem --> /etc/ssl/certs/44352.pem (1708 bytes)
	I1202 19:07:21.664872   49088 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/files/etc/test/nested/copy/4435/hosts --> /etc/test/nested/copy/4435/hosts (40 bytes)
	I1202 19:07:21.686465   49088 start.go:296] duration metric: took 171.184702ms for postStartSetup
	I1202 19:07:21.686545   49088 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I1202 19:07:21.686646   49088 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-449836
	I1202 19:07:21.708068   49088 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:32788 SSHKeyPath:/home/jenkins/minikube-integration/22021-2487/.minikube/machines/functional-449836/id_rsa Username:docker}
	I1202 19:07:21.808826   49088 command_runner.go:130] > 18%
	I1202 19:07:21.809461   49088 ssh_runner.go:195] Run: sh -c "df -BG /var | awk 'NR==2{print $4}'"
	I1202 19:07:21.814183   49088 command_runner.go:130] > 159G
	I1202 19:07:21.814719   49088 fix.go:56] duration metric: took 1.390932828s for fixHost
	I1202 19:07:21.814741   49088 start.go:83] releasing machines lock for "functional-449836", held for 1.391011327s
	I1202 19:07:21.814809   49088 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" functional-449836
	I1202 19:07:21.831833   49088 ssh_runner.go:195] Run: cat /version.json
	I1202 19:07:21.831895   49088 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-449836
	I1202 19:07:21.832169   49088 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I1202 19:07:21.832229   49088 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-449836
	I1202 19:07:21.852617   49088 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:32788 SSHKeyPath:/home/jenkins/minikube-integration/22021-2487/.minikube/machines/functional-449836/id_rsa Username:docker}
	I1202 19:07:21.855772   49088 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:32788 SSHKeyPath:/home/jenkins/minikube-integration/22021-2487/.minikube/machines/functional-449836/id_rsa Username:docker}
	I1202 19:07:21.955939   49088 command_runner.go:130] > {"iso_version": "v1.37.0-1763503576-21924", "kicbase_version": "v0.0.48-1764169655-21974", "minikube_version": "v1.37.0", "commit": "5499406178e21d60d74d327c9716de794e8a4797"}
	I1202 19:07:21.956090   49088 ssh_runner.go:195] Run: systemctl --version
	I1202 19:07:22.048548   49088 command_runner.go:130] > <a href="https://github.com/kubernetes/registry.k8s.io">Temporary Redirect</a>.
	I1202 19:07:22.051368   49088 command_runner.go:130] > systemd 252 (252.39-1~deb12u1)
	I1202 19:07:22.051402   49088 command_runner.go:130] > +PAM +AUDIT +SELINUX +APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT +QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified
	I1202 19:07:22.051488   49088 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	I1202 19:07:22.055900   49088 command_runner.go:130] ! stat: cannot statx '/etc/cni/net.d/*loopback.conf*': No such file or directory
	W1202 19:07:22.056072   49088 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I1202 19:07:22.056144   49088 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%p, " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I1202 19:07:22.064483   49088 cni.go:259] no active bridge cni configs found in "/etc/cni/net.d" - nothing to disable
	I1202 19:07:22.064507   49088 start.go:496] detecting cgroup driver to use...
	I1202 19:07:22.064540   49088 detect.go:187] detected "cgroupfs" cgroup driver on host os
	I1202 19:07:22.064608   49088 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I1202 19:07:22.080944   49088 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I1202 19:07:22.094328   49088 docker.go:218] disabling cri-docker service (if available) ...
	I1202 19:07:22.094412   49088 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.socket
	I1202 19:07:22.110538   49088 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.service
	I1202 19:07:22.123916   49088 ssh_runner.go:195] Run: sudo systemctl disable cri-docker.socket
	I1202 19:07:22.251555   49088 ssh_runner.go:195] Run: sudo systemctl mask cri-docker.service
	I1202 19:07:22.372403   49088 docker.go:234] disabling docker service ...
	I1202 19:07:22.372547   49088 ssh_runner.go:195] Run: sudo systemctl stop -f docker.socket
	I1202 19:07:22.390362   49088 ssh_runner.go:195] Run: sudo systemctl stop -f docker.service
	I1202 19:07:22.404129   49088 ssh_runner.go:195] Run: sudo systemctl disable docker.socket
	I1202 19:07:22.527674   49088 ssh_runner.go:195] Run: sudo systemctl mask docker.service
	I1202 19:07:22.641245   49088 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service docker
	I1202 19:07:22.654510   49088 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I1202 19:07:22.669149   49088 command_runner.go:130] > runtime-endpoint: unix:///run/containerd/containerd.sock
	I1202 19:07:22.670616   49088 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.10.1"|' /etc/containerd/config.toml"
	I1202 19:07:22.680782   49088 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I1202 19:07:22.690619   49088 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I1202 19:07:22.690690   49088 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I1202 19:07:22.700650   49088 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I1202 19:07:22.710637   49088 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I1202 19:07:22.720237   49088 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I1202 19:07:22.730375   49088 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I1202 19:07:22.738458   49088 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I1202 19:07:22.747256   49088 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I1202 19:07:22.756269   49088 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
	I1202 19:07:22.765824   49088 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I1202 19:07:22.772632   49088 command_runner.go:130] > net.bridge.bridge-nf-call-iptables = 1
	I1202 19:07:22.773683   49088 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I1202 19:07:22.781384   49088 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1202 19:07:22.894036   49088 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I1202 19:07:22.996092   49088 start.go:543] Will wait 60s for socket path /run/containerd/containerd.sock
	I1202 19:07:22.996190   49088 ssh_runner.go:195] Run: stat /run/containerd/containerd.sock
	I1202 19:07:23.000049   49088 command_runner.go:130] >   File: /run/containerd/containerd.sock
	I1202 19:07:23.000075   49088 command_runner.go:130] >   Size: 0         	Blocks: 0          IO Block: 4096   socket
	I1202 19:07:23.000083   49088 command_runner.go:130] > Device: 0,72	Inode: 1611        Links: 1
	I1202 19:07:23.000090   49088 command_runner.go:130] > Access: (0660/srw-rw----)  Uid: (    0/    root)   Gid: (    0/    root)
	I1202 19:07:23.000119   49088 command_runner.go:130] > Access: 2025-12-02 19:07:22.966112143 +0000
	I1202 19:07:23.000134   49088 command_runner.go:130] > Modify: 2025-12-02 19:07:22.966112143 +0000
	I1202 19:07:23.000139   49088 command_runner.go:130] > Change: 2025-12-02 19:07:22.966112143 +0000
	I1202 19:07:23.000143   49088 command_runner.go:130] >  Birth: -
	I1202 19:07:23.000708   49088 start.go:564] Will wait 60s for crictl version
	I1202 19:07:23.000798   49088 ssh_runner.go:195] Run: which crictl
	I1202 19:07:23.004553   49088 command_runner.go:130] > /usr/local/bin/crictl
	I1202 19:07:23.004698   49088 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl version
	I1202 19:07:23.031006   49088 command_runner.go:130] > Version:  0.1.0
	I1202 19:07:23.031142   49088 command_runner.go:130] > RuntimeName:  containerd
	I1202 19:07:23.031156   49088 command_runner.go:130] > RuntimeVersion:  v2.1.5
	I1202 19:07:23.031165   49088 command_runner.go:130] > RuntimeApiVersion:  v1
	I1202 19:07:23.033497   49088 start.go:580] Version:  0.1.0
	RuntimeName:  containerd
	RuntimeVersion:  v2.1.5
	RuntimeApiVersion:  v1
	I1202 19:07:23.033588   49088 ssh_runner.go:195] Run: containerd --version
	I1202 19:07:23.053512   49088 command_runner.go:130] > containerd containerd.io v2.1.5 fcd43222d6b07379a4be9786bda52438f0dd16a1
	I1202 19:07:23.055064   49088 ssh_runner.go:195] Run: containerd --version
	I1202 19:07:23.073280   49088 command_runner.go:130] > containerd containerd.io v2.1.5 fcd43222d6b07379a4be9786bda52438f0dd16a1
	I1202 19:07:23.080684   49088 out.go:179] * Preparing Kubernetes v1.35.0-beta.0 on containerd 2.1.5 ...
	I1202 19:07:23.083736   49088 cli_runner.go:164] Run: docker network inspect functional-449836 --format "{"Name": "{{.Name}}","Driver": "{{.Driver}}","Subnet": "{{range .IPAM.Config}}{{.Subnet}}{{end}}","Gateway": "{{range .IPAM.Config}}{{.Gateway}}{{end}}","MTU": {{if (index .Options "com.docker.network.driver.mtu")}}{{(index .Options "com.docker.network.driver.mtu")}}{{else}}0{{end}}, "ContainerIPs": [{{range $k,$v := .Containers }}"{{$v.IPv4Address}}",{{end}}]}"
	I1202 19:07:23.100485   49088 ssh_runner.go:195] Run: grep 192.168.49.1	host.minikube.internal$ /etc/hosts
	I1202 19:07:23.104603   49088 command_runner.go:130] > 192.168.49.1	host.minikube.internal
	I1202 19:07:23.104709   49088 kubeadm.go:884] updating cluster {Name:functional-449836 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-449836 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} ...
	I1202 19:07:23.104831   49088 preload.go:188] Checking if preload exists for k8s version v1.35.0-beta.0 and runtime containerd
	I1202 19:07:23.104890   49088 ssh_runner.go:195] Run: sudo crictl images --output json
	I1202 19:07:23.127690   49088 command_runner.go:130] > {
	I1202 19:07:23.127710   49088 command_runner.go:130] >   "images":  [
	I1202 19:07:23.127715   49088 command_runner.go:130] >     {
	I1202 19:07:23.127725   49088 command_runner.go:130] >       "id":  "sha256:66749159455b3f08c8318fe0233122f54d0f5889f9c5fdfb73c3fd9d99895b51",
	I1202 19:07:23.127729   49088 command_runner.go:130] >       "repoTags":  [
	I1202 19:07:23.127744   49088 command_runner.go:130] >         "gcr.io/k8s-minikube/storage-provisioner:v5"
	I1202 19:07:23.127750   49088 command_runner.go:130] >       ],
	I1202 19:07:23.127755   49088 command_runner.go:130] >       "repoDigests":  [],
	I1202 19:07:23.127759   49088 command_runner.go:130] >       "size":  "8032639",
	I1202 19:07:23.127765   49088 command_runner.go:130] >       "username":  "",
	I1202 19:07:23.127776   49088 command_runner.go:130] >       "pinned":  false
	I1202 19:07:23.127781   49088 command_runner.go:130] >     },
	I1202 19:07:23.127784   49088 command_runner.go:130] >     {
	I1202 19:07:23.127792   49088 command_runner.go:130] >       "id":  "sha256:e08f4d9d2e6ede8185064c13b41f8eeee95b609c0ca93b6fe7509fe527c907cf",
	I1202 19:07:23.127800   49088 command_runner.go:130] >       "repoTags":  [
	I1202 19:07:23.127806   49088 command_runner.go:130] >         "registry.k8s.io/coredns/coredns:v1.13.1"
	I1202 19:07:23.127813   49088 command_runner.go:130] >       ],
	I1202 19:07:23.127817   49088 command_runner.go:130] >       "repoDigests":  [],
	I1202 19:07:23.127822   49088 command_runner.go:130] >       "size":  "21166088",
	I1202 19:07:23.127826   49088 command_runner.go:130] >       "username":  "nonroot",
	I1202 19:07:23.127832   49088 command_runner.go:130] >       "pinned":  false
	I1202 19:07:23.127835   49088 command_runner.go:130] >     },
	I1202 19:07:23.127838   49088 command_runner.go:130] >     {
	I1202 19:07:23.127845   49088 command_runner.go:130] >       "id":  "sha256:2c5f0dedd21c25ec3a6709934d22152d53ec50fe57b72d29e4450655e3d14d42",
	I1202 19:07:23.127855   49088 command_runner.go:130] >       "repoTags":  [
	I1202 19:07:23.127869   49088 command_runner.go:130] >         "registry.k8s.io/etcd:3.6.5-0"
	I1202 19:07:23.127876   49088 command_runner.go:130] >       ],
	I1202 19:07:23.127880   49088 command_runner.go:130] >       "repoDigests":  [],
	I1202 19:07:23.127887   49088 command_runner.go:130] >       "size":  "21134420",
	I1202 19:07:23.127892   49088 command_runner.go:130] >       "uid":  {
	I1202 19:07:23.127899   49088 command_runner.go:130] >         "value":  "0"
	I1202 19:07:23.127903   49088 command_runner.go:130] >       },
	I1202 19:07:23.127907   49088 command_runner.go:130] >       "username":  "",
	I1202 19:07:23.127911   49088 command_runner.go:130] >       "pinned":  false
	I1202 19:07:23.127917   49088 command_runner.go:130] >     },
	I1202 19:07:23.127919   49088 command_runner.go:130] >     {
	I1202 19:07:23.127926   49088 command_runner.go:130] >       "id":  "sha256:ccd634d9bcc36ac6235e9c86676cd3a02c06afc3788a25f1bbf39ca7d44585f4",
	I1202 19:07:23.127930   49088 command_runner.go:130] >       "repoTags":  [
	I1202 19:07:23.127938   49088 command_runner.go:130] >         "registry.k8s.io/kube-apiserver:v1.35.0-beta.0"
	I1202 19:07:23.127945   49088 command_runner.go:130] >       ],
	I1202 19:07:23.127949   49088 command_runner.go:130] >       "repoDigests":  [],
	I1202 19:07:23.127953   49088 command_runner.go:130] >       "size":  "24676285",
	I1202 19:07:23.127961   49088 command_runner.go:130] >       "uid":  {
	I1202 19:07:23.127965   49088 command_runner.go:130] >         "value":  "0"
	I1202 19:07:23.127971   49088 command_runner.go:130] >       },
	I1202 19:07:23.127975   49088 command_runner.go:130] >       "username":  "",
	I1202 19:07:23.127983   49088 command_runner.go:130] >       "pinned":  false
	I1202 19:07:23.127987   49088 command_runner.go:130] >     },
	I1202 19:07:23.127996   49088 command_runner.go:130] >     {
	I1202 19:07:23.128002   49088 command_runner.go:130] >       "id":  "sha256:68b5f775f18769fcb77bd8474c80bda2050163b6c66f4551f352b7381b8ca5be",
	I1202 19:07:23.128006   49088 command_runner.go:130] >       "repoTags":  [
	I1202 19:07:23.128012   49088 command_runner.go:130] >         "registry.k8s.io/kube-controller-manager:v1.35.0-beta.0"
	I1202 19:07:23.128015   49088 command_runner.go:130] >       ],
	I1202 19:07:23.128019   49088 command_runner.go:130] >       "repoDigests":  [],
	I1202 19:07:23.128026   49088 command_runner.go:130] >       "size":  "20658969",
	I1202 19:07:23.128029   49088 command_runner.go:130] >       "uid":  {
	I1202 19:07:23.128033   49088 command_runner.go:130] >         "value":  "0"
	I1202 19:07:23.128041   49088 command_runner.go:130] >       },
	I1202 19:07:23.128052   49088 command_runner.go:130] >       "username":  "",
	I1202 19:07:23.128059   49088 command_runner.go:130] >       "pinned":  false
	I1202 19:07:23.128063   49088 command_runner.go:130] >     },
	I1202 19:07:23.128070   49088 command_runner.go:130] >     {
	I1202 19:07:23.128077   49088 command_runner.go:130] >       "id":  "sha256:404c2e12861777b763b8feaa316d36680fc68ad308a8d2f6e55f1bb981cdd904",
	I1202 19:07:23.128081   49088 command_runner.go:130] >       "repoTags":  [
	I1202 19:07:23.128088   49088 command_runner.go:130] >         "registry.k8s.io/kube-proxy:v1.35.0-beta.0"
	I1202 19:07:23.128092   49088 command_runner.go:130] >       ],
	I1202 19:07:23.128096   49088 command_runner.go:130] >       "repoDigests":  [],
	I1202 19:07:23.128099   49088 command_runner.go:130] >       "size":  "22428165",
	I1202 19:07:23.128103   49088 command_runner.go:130] >       "username":  "",
	I1202 19:07:23.128109   49088 command_runner.go:130] >       "pinned":  false
	I1202 19:07:23.128113   49088 command_runner.go:130] >     },
	I1202 19:07:23.128116   49088 command_runner.go:130] >     {
	I1202 19:07:23.128123   49088 command_runner.go:130] >       "id":  "sha256:16378741539f1be9c6e347d127537d379a6592587b09b4eb47964cb5c43a409b",
	I1202 19:07:23.128130   49088 command_runner.go:130] >       "repoTags":  [
	I1202 19:07:23.128135   49088 command_runner.go:130] >         "registry.k8s.io/kube-scheduler:v1.35.0-beta.0"
	I1202 19:07:23.128143   49088 command_runner.go:130] >       ],
	I1202 19:07:23.128152   49088 command_runner.go:130] >       "repoDigests":  [],
	I1202 19:07:23.128160   49088 command_runner.go:130] >       "size":  "15389290",
	I1202 19:07:23.128163   49088 command_runner.go:130] >       "uid":  {
	I1202 19:07:23.128167   49088 command_runner.go:130] >         "value":  "0"
	I1202 19:07:23.128170   49088 command_runner.go:130] >       },
	I1202 19:07:23.128175   49088 command_runner.go:130] >       "username":  "",
	I1202 19:07:23.128179   49088 command_runner.go:130] >       "pinned":  false
	I1202 19:07:23.128185   49088 command_runner.go:130] >     },
	I1202 19:07:23.128188   49088 command_runner.go:130] >     {
	I1202 19:07:23.128199   49088 command_runner.go:130] >       "id":  "sha256:d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd",
	I1202 19:07:23.128203   49088 command_runner.go:130] >       "repoTags":  [
	I1202 19:07:23.128212   49088 command_runner.go:130] >         "registry.k8s.io/pause:3.10.1"
	I1202 19:07:23.128215   49088 command_runner.go:130] >       ],
	I1202 19:07:23.128223   49088 command_runner.go:130] >       "repoDigests":  [],
	I1202 19:07:23.128227   49088 command_runner.go:130] >       "size":  "265458",
	I1202 19:07:23.128238   49088 command_runner.go:130] >       "uid":  {
	I1202 19:07:23.128243   49088 command_runner.go:130] >         "value":  "65535"
	I1202 19:07:23.128248   49088 command_runner.go:130] >       },
	I1202 19:07:23.128252   49088 command_runner.go:130] >       "username":  "",
	I1202 19:07:23.128256   49088 command_runner.go:130] >       "pinned":  true
	I1202 19:07:23.128259   49088 command_runner.go:130] >     }
	I1202 19:07:23.128262   49088 command_runner.go:130] >   ]
	I1202 19:07:23.128265   49088 command_runner.go:130] > }
	I1202 19:07:23.130379   49088 containerd.go:627] all images are preloaded for containerd runtime.
	I1202 19:07:23.130403   49088 cache_images.go:86] Images are preloaded, skipping loading
	I1202 19:07:23.130410   49088 kubeadm.go:935] updating node { 192.168.49.2 8441 v1.35.0-beta.0 containerd true true} ...
	I1202 19:07:23.130509   49088 kubeadm.go:947] kubelet [Unit]
	Wants=containerd.service
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.35.0-beta.0/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=functional-449836 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.168.49.2
	
	[Install]
	 config:
	{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-449836 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I1202 19:07:23.130576   49088 ssh_runner.go:195] Run: sudo crictl info
	I1202 19:07:23.152707   49088 command_runner.go:130] > {
	I1202 19:07:23.152731   49088 command_runner.go:130] >   "cniconfig": {
	I1202 19:07:23.152737   49088 command_runner.go:130] >     "Networks": [
	I1202 19:07:23.152741   49088 command_runner.go:130] >       {
	I1202 19:07:23.152746   49088 command_runner.go:130] >         "Config": {
	I1202 19:07:23.152752   49088 command_runner.go:130] >           "CNIVersion": "0.3.1",
	I1202 19:07:23.152758   49088 command_runner.go:130] >           "Name": "cni-loopback",
	I1202 19:07:23.152768   49088 command_runner.go:130] >           "Plugins": [
	I1202 19:07:23.152775   49088 command_runner.go:130] >             {
	I1202 19:07:23.152779   49088 command_runner.go:130] >               "Network": {
	I1202 19:07:23.152784   49088 command_runner.go:130] >                 "ipam": {},
	I1202 19:07:23.152789   49088 command_runner.go:130] >                 "type": "loopback"
	I1202 19:07:23.152798   49088 command_runner.go:130] >               },
	I1202 19:07:23.152803   49088 command_runner.go:130] >               "Source": "{\"type\":\"loopback\"}"
	I1202 19:07:23.152810   49088 command_runner.go:130] >             }
	I1202 19:07:23.152814   49088 command_runner.go:130] >           ],
	I1202 19:07:23.152828   49088 command_runner.go:130] >           "Source": "{\n\"cniVersion\": \"0.3.1\",\n\"name\": \"cni-loopback\",\n\"plugins\": [{\n  \"type\": \"loopback\"\n}]\n}"
	I1202 19:07:23.152835   49088 command_runner.go:130] >         },
	I1202 19:07:23.152840   49088 command_runner.go:130] >         "IFName": "lo"
	I1202 19:07:23.152847   49088 command_runner.go:130] >       }
	I1202 19:07:23.152850   49088 command_runner.go:130] >     ],
	I1202 19:07:23.152855   49088 command_runner.go:130] >     "PluginConfDir": "/etc/cni/net.d",
	I1202 19:07:23.152860   49088 command_runner.go:130] >     "PluginDirs": [
	I1202 19:07:23.152865   49088 command_runner.go:130] >       "/opt/cni/bin"
	I1202 19:07:23.152869   49088 command_runner.go:130] >     ],
	I1202 19:07:23.152873   49088 command_runner.go:130] >     "PluginMaxConfNum": 1,
	I1202 19:07:23.152879   49088 command_runner.go:130] >     "Prefix": "eth"
	I1202 19:07:23.152883   49088 command_runner.go:130] >   },
	I1202 19:07:23.152891   49088 command_runner.go:130] >   "config": {
	I1202 19:07:23.152894   49088 command_runner.go:130] >     "cdiSpecDirs": [
	I1202 19:07:23.152898   49088 command_runner.go:130] >       "/etc/cdi",
	I1202 19:07:23.152907   49088 command_runner.go:130] >       "/var/run/cdi"
	I1202 19:07:23.152910   49088 command_runner.go:130] >     ],
	I1202 19:07:23.152917   49088 command_runner.go:130] >     "cni": {
	I1202 19:07:23.152921   49088 command_runner.go:130] >       "binDir": "",
	I1202 19:07:23.152928   49088 command_runner.go:130] >       "binDirs": [
	I1202 19:07:23.152933   49088 command_runner.go:130] >         "/opt/cni/bin"
	I1202 19:07:23.152936   49088 command_runner.go:130] >       ],
	I1202 19:07:23.152941   49088 command_runner.go:130] >       "confDir": "/etc/cni/net.d",
	I1202 19:07:23.152947   49088 command_runner.go:130] >       "confTemplate": "",
	I1202 19:07:23.152954   49088 command_runner.go:130] >       "ipPref": "",
	I1202 19:07:23.152958   49088 command_runner.go:130] >       "maxConfNum": 1,
	I1202 19:07:23.152963   49088 command_runner.go:130] >       "setupSerially": false,
	I1202 19:07:23.152969   49088 command_runner.go:130] >       "useInternalLoopback": false
	I1202 19:07:23.152977   49088 command_runner.go:130] >     },
	I1202 19:07:23.152983   49088 command_runner.go:130] >     "containerd": {
	I1202 19:07:23.152992   49088 command_runner.go:130] >       "defaultRuntimeName": "runc",
	I1202 19:07:23.152997   49088 command_runner.go:130] >       "ignoreBlockIONotEnabledErrors": false,
	I1202 19:07:23.153006   49088 command_runner.go:130] >       "ignoreRdtNotEnabledErrors": false,
	I1202 19:07:23.153010   49088 command_runner.go:130] >       "runtimes": {
	I1202 19:07:23.153017   49088 command_runner.go:130] >         "runc": {
	I1202 19:07:23.153022   49088 command_runner.go:130] >           "ContainerAnnotations": null,
	I1202 19:07:23.153026   49088 command_runner.go:130] >           "PodAnnotations": null,
	I1202 19:07:23.153031   49088 command_runner.go:130] >           "baseRuntimeSpec": "",
	I1202 19:07:23.153035   49088 command_runner.go:130] >           "cgroupWritable": false,
	I1202 19:07:23.153041   49088 command_runner.go:130] >           "cniConfDir": "",
	I1202 19:07:23.153046   49088 command_runner.go:130] >           "cniMaxConfNum": 0,
	I1202 19:07:23.153053   49088 command_runner.go:130] >           "io_type": "",
	I1202 19:07:23.153058   49088 command_runner.go:130] >           "options": {
	I1202 19:07:23.153066   49088 command_runner.go:130] >             "BinaryName": "",
	I1202 19:07:23.153071   49088 command_runner.go:130] >             "CriuImagePath": "",
	I1202 19:07:23.153079   49088 command_runner.go:130] >             "CriuWorkPath": "",
	I1202 19:07:23.153083   49088 command_runner.go:130] >             "IoGid": 0,
	I1202 19:07:23.153091   49088 command_runner.go:130] >             "IoUid": 0,
	I1202 19:07:23.153096   49088 command_runner.go:130] >             "NoNewKeyring": false,
	I1202 19:07:23.153100   49088 command_runner.go:130] >             "Root": "",
	I1202 19:07:23.153104   49088 command_runner.go:130] >             "ShimCgroup": "",
	I1202 19:07:23.153111   49088 command_runner.go:130] >             "SystemdCgroup": false
	I1202 19:07:23.153115   49088 command_runner.go:130] >           },
	I1202 19:07:23.153120   49088 command_runner.go:130] >           "privileged_without_host_devices": false,
	I1202 19:07:23.153128   49088 command_runner.go:130] >           "privileged_without_host_devices_all_devices_allowed": false,
	I1202 19:07:23.153136   49088 command_runner.go:130] >           "runtimePath": "",
	I1202 19:07:23.153143   49088 command_runner.go:130] >           "runtimeType": "io.containerd.runc.v2",
	I1202 19:07:23.153237   49088 command_runner.go:130] >           "sandboxer": "podsandbox",
	I1202 19:07:23.153375   49088 command_runner.go:130] >           "snapshotter": ""
	I1202 19:07:23.153385   49088 command_runner.go:130] >         }
	I1202 19:07:23.153389   49088 command_runner.go:130] >       }
	I1202 19:07:23.153393   49088 command_runner.go:130] >     },
	I1202 19:07:23.153414   49088 command_runner.go:130] >     "containerdEndpoint": "/run/containerd/containerd.sock",
	I1202 19:07:23.153424   49088 command_runner.go:130] >     "containerdRootDir": "/var/lib/containerd",
	I1202 19:07:23.153435   49088 command_runner.go:130] >     "device_ownership_from_security_context": false,
	I1202 19:07:23.153444   49088 command_runner.go:130] >     "disableApparmor": false,
	I1202 19:07:23.153449   49088 command_runner.go:130] >     "disableHugetlbController": true,
	I1202 19:07:23.153457   49088 command_runner.go:130] >     "disableProcMount": false,
	I1202 19:07:23.153467   49088 command_runner.go:130] >     "drainExecSyncIOTimeout": "0s",
	I1202 19:07:23.153475   49088 command_runner.go:130] >     "enableCDI": true,
	I1202 19:07:23.153479   49088 command_runner.go:130] >     "enableSelinux": false,
	I1202 19:07:23.153484   49088 command_runner.go:130] >     "enableUnprivilegedICMP": true,
	I1202 19:07:23.153490   49088 command_runner.go:130] >     "enableUnprivilegedPorts": true,
	I1202 19:07:23.153500   49088 command_runner.go:130] >     "ignoreDeprecationWarnings": null,
	I1202 19:07:23.153508   49088 command_runner.go:130] >     "ignoreImageDefinedVolumes": false,
	I1202 19:07:23.153516   49088 command_runner.go:130] >     "maxContainerLogLineSize": 16384,
	I1202 19:07:23.153522   49088 command_runner.go:130] >     "netnsMountsUnderStateDir": false,
	I1202 19:07:23.153534   49088 command_runner.go:130] >     "restrictOOMScoreAdj": false,
	I1202 19:07:23.153544   49088 command_runner.go:130] >     "rootDir": "/var/lib/containerd/io.containerd.grpc.v1.cri",
	I1202 19:07:23.153549   49088 command_runner.go:130] >     "selinuxCategoryRange": 1024,
	I1202 19:07:23.153562   49088 command_runner.go:130] >     "stateDir": "/run/containerd/io.containerd.grpc.v1.cri",
	I1202 19:07:23.153570   49088 command_runner.go:130] >     "tolerateMissingHugetlbController": true,
	I1202 19:07:23.153575   49088 command_runner.go:130] >     "unsetSeccompProfile": ""
	I1202 19:07:23.153578   49088 command_runner.go:130] >   },
	I1202 19:07:23.153582   49088 command_runner.go:130] >   "features": {
	I1202 19:07:23.153588   49088 command_runner.go:130] >     "supplemental_groups_policy": true
	I1202 19:07:23.153597   49088 command_runner.go:130] >   },
	I1202 19:07:23.153605   49088 command_runner.go:130] >   "golang": "go1.24.9",
	I1202 19:07:23.153615   49088 command_runner.go:130] >   "lastCNILoadStatus": "cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config",
	I1202 19:07:23.153633   49088 command_runner.go:130] >   "lastCNILoadStatus.default": "cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config",
	I1202 19:07:23.153644   49088 command_runner.go:130] >   "runtimeHandlers": [
	I1202 19:07:23.153649   49088 command_runner.go:130] >     {
	I1202 19:07:23.153658   49088 command_runner.go:130] >       "features": {
	I1202 19:07:23.153664   49088 command_runner.go:130] >         "recursive_read_only_mounts": true,
	I1202 19:07:23.153669   49088 command_runner.go:130] >         "user_namespaces": true
	I1202 19:07:23.153675   49088 command_runner.go:130] >       }
	I1202 19:07:23.153679   49088 command_runner.go:130] >     },
	I1202 19:07:23.153686   49088 command_runner.go:130] >     {
	I1202 19:07:23.153691   49088 command_runner.go:130] >       "features": {
	I1202 19:07:23.153703   49088 command_runner.go:130] >         "recursive_read_only_mounts": true,
	I1202 19:07:23.153708   49088 command_runner.go:130] >         "user_namespaces": true
	I1202 19:07:23.153715   49088 command_runner.go:130] >       },
	I1202 19:07:23.153720   49088 command_runner.go:130] >       "name": "runc"
	I1202 19:07:23.153727   49088 command_runner.go:130] >     }
	I1202 19:07:23.153731   49088 command_runner.go:130] >   ],
	I1202 19:07:23.153738   49088 command_runner.go:130] >   "status": {
	I1202 19:07:23.153742   49088 command_runner.go:130] >     "conditions": [
	I1202 19:07:23.153746   49088 command_runner.go:130] >       {
	I1202 19:07:23.153751   49088 command_runner.go:130] >         "message": "",
	I1202 19:07:23.153757   49088 command_runner.go:130] >         "reason": "",
	I1202 19:07:23.153766   49088 command_runner.go:130] >         "status": true,
	I1202 19:07:23.153774   49088 command_runner.go:130] >         "type": "RuntimeReady"
	I1202 19:07:23.153781   49088 command_runner.go:130] >       },
	I1202 19:07:23.153785   49088 command_runner.go:130] >       {
	I1202 19:07:23.153792   49088 command_runner.go:130] >         "message": "Network plugin returns error: cni plugin not initialized",
	I1202 19:07:23.153797   49088 command_runner.go:130] >         "reason": "NetworkPluginNotReady",
	I1202 19:07:23.153805   49088 command_runner.go:130] >         "status": false,
	I1202 19:07:23.153810   49088 command_runner.go:130] >         "type": "NetworkReady"
	I1202 19:07:23.153814   49088 command_runner.go:130] >       },
	I1202 19:07:23.153820   49088 command_runner.go:130] >       {
	I1202 19:07:23.153824   49088 command_runner.go:130] >         "message": "",
	I1202 19:07:23.153828   49088 command_runner.go:130] >         "reason": "",
	I1202 19:07:23.153836   49088 command_runner.go:130] >         "status": true,
	I1202 19:07:23.153850   49088 command_runner.go:130] >         "type": "ContainerdHasNoDeprecationWarnings"
	I1202 19:07:23.153857   49088 command_runner.go:130] >       }
	I1202 19:07:23.153861   49088 command_runner.go:130] >     ]
	I1202 19:07:23.153868   49088 command_runner.go:130] >   }
	I1202 19:07:23.153871   49088 command_runner.go:130] > }
	I1202 19:07:23.157283   49088 cni.go:84] Creating CNI manager for ""
	I1202 19:07:23.157307   49088 cni.go:143] "docker" driver + "containerd" runtime found, recommending kindnet
	I1202 19:07:23.157324   49088 kubeadm.go:85] Using pod CIDR: 10.244.0.0/16
	I1202 19:07:23.157352   49088 kubeadm.go:190] kubeadm options: {CertDir:/var/lib/minikube/certs ServiceCIDR:10.96.0.0/12 PodSubnet:10.244.0.0/16 AdvertiseAddress:192.168.49.2 APIServerPort:8441 KubernetesVersion:v1.35.0-beta.0 EtcdDataDir:/var/lib/minikube/etcd EtcdExtraArgs:map[] ClusterName:functional-449836 NodeName:functional-449836 DNSDomain:cluster.local CRISocket:/run/containerd/containerd.sock ImageRepository: ComponentOptions:[{Component:apiServer ExtraArgs:map[enable-admission-plugins:NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota] Pairs:map[certSANs:["127.0.0.1", "localhost", "192.168.49.2"]]} {Component:controllerManager ExtraArgs:map[allocate-node-cidrs:true leader-elect:false] Pairs:map[]} {Component:scheduler ExtraArgs:map[leader-elect:false] Pairs:map[]}] FeatureArgs:map[] NodeIP:192.168.49.2 CgroupDriver:cgroupfs ClientCAFile:/var/lib/minikube/certs/ca.crt StaticPodPath:/etc/kubernetes/manifests ControlPlaneAddress:control-plane.minikube.internal KubeProxyOptions:map[] ResolvConfSearchRegression:false KubeletConfigOpts:map[containerRuntimeEndpoint:unix:///run/containerd/containerd.sock hairpinMode:hairpin-veth runtimeRequestTimeout:15m] PrependCriSocketUnix:true}
	I1202 19:07:23.157503   49088 kubeadm.go:196] kubeadm config:
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: InitConfiguration
	localAPIEndpoint:
	  advertiseAddress: 192.168.49.2
	  bindPort: 8441
	bootstrapTokens:
	  - groups:
	      - system:bootstrappers:kubeadm:default-node-token
	    ttl: 24h0m0s
	    usages:
	      - signing
	      - authentication
	nodeRegistration:
	  criSocket: unix:///run/containerd/containerd.sock
	  name: "functional-449836"
	  kubeletExtraArgs:
	    - name: "node-ip"
	      value: "192.168.49.2"
	  taints: []
	---
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: ClusterConfiguration
	apiServer:
	  certSANs: ["127.0.0.1", "localhost", "192.168.49.2"]
	  extraArgs:
	    - name: "enable-admission-plugins"
	      value: "NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota"
	controllerManager:
	  extraArgs:
	    - name: "allocate-node-cidrs"
	      value: "true"
	    - name: "leader-elect"
	      value: "false"
	scheduler:
	  extraArgs:
	    - name: "leader-elect"
	      value: "false"
	certificatesDir: /var/lib/minikube/certs
	clusterName: mk
	controlPlaneEndpoint: control-plane.minikube.internal:8441
	etcd:
	  local:
	    dataDir: /var/lib/minikube/etcd
	kubernetesVersion: v1.35.0-beta.0
	networking:
	  dnsDomain: cluster.local
	  podSubnet: "10.244.0.0/16"
	  serviceSubnet: 10.96.0.0/12
	---
	apiVersion: kubelet.config.k8s.io/v1beta1
	kind: KubeletConfiguration
	authentication:
	  x509:
	    clientCAFile: /var/lib/minikube/certs/ca.crt
	cgroupDriver: cgroupfs
	containerRuntimeEndpoint: unix:///run/containerd/containerd.sock
	hairpinMode: hairpin-veth
	runtimeRequestTimeout: 15m
	clusterDomain: "cluster.local"
	# disable disk resource management by default
	imageGCHighThresholdPercent: 100
	evictionHard:
	  nodefs.available: "0%"
	  nodefs.inodesFree: "0%"
	  imagefs.available: "0%"
	failSwapOn: false
	staticPodPath: /etc/kubernetes/manifests
	---
	apiVersion: kubeproxy.config.k8s.io/v1alpha1
	kind: KubeProxyConfiguration
	clusterCIDR: "10.244.0.0/16"
	metricsBindAddress: 0.0.0.0:10249
	conntrack:
	  maxPerCore: 0
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_established"
	  tcpEstablishedTimeout: 0s
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_close"
	  tcpCloseWaitTimeout: 0s
	
	I1202 19:07:23.157589   49088 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.35.0-beta.0
	I1202 19:07:23.165274   49088 command_runner.go:130] > kubeadm
	I1202 19:07:23.165296   49088 command_runner.go:130] > kubectl
	I1202 19:07:23.165301   49088 command_runner.go:130] > kubelet
	I1202 19:07:23.166244   49088 binaries.go:51] Found k8s binaries, skipping transfer
	I1202 19:07:23.166309   49088 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /var/tmp/minikube
	I1202 19:07:23.176520   49088 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (328 bytes)
	I1202 19:07:23.191534   49088 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (359 bytes)
	I1202 19:07:23.207596   49088 ssh_runner.go:362] scp memory --> /var/tmp/minikube/kubeadm.yaml.new (2237 bytes)
	I1202 19:07:23.221899   49088 ssh_runner.go:195] Run: grep 192.168.49.2	control-plane.minikube.internal$ /etc/hosts
	I1202 19:07:23.225538   49088 command_runner.go:130] > 192.168.49.2	control-plane.minikube.internal
	I1202 19:07:23.225972   49088 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1202 19:07:23.344071   49088 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I1202 19:07:24.171449   49088 certs.go:69] Setting up /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836 for IP: 192.168.49.2
	I1202 19:07:24.171473   49088 certs.go:195] generating shared ca certs ...
	I1202 19:07:24.171491   49088 certs.go:227] acquiring lock for ca certs: {Name:mk2ce7651a779b9fbf8eac798f9ac184328de0c6 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1202 19:07:24.171633   49088 certs.go:236] skipping valid "minikubeCA" ca cert: /home/jenkins/minikube-integration/22021-2487/.minikube/ca.key
	I1202 19:07:24.171683   49088 certs.go:236] skipping valid "proxyClientCA" ca cert: /home/jenkins/minikube-integration/22021-2487/.minikube/proxy-client-ca.key
	I1202 19:07:24.171697   49088 certs.go:257] generating profile certs ...
	I1202 19:07:24.171794   49088 certs.go:360] skipping valid signed profile cert regeneration for "minikube-user": /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/client.key
	I1202 19:07:24.171860   49088 certs.go:360] skipping valid signed profile cert regeneration for "minikube": /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/apiserver.key.a65b71da
	I1202 19:07:24.171905   49088 certs.go:360] skipping valid signed profile cert regeneration for "aggregator": /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/proxy-client.key
	I1202 19:07:24.171916   49088 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22021-2487/.minikube/ca.crt -> /var/lib/minikube/certs/ca.crt
	I1202 19:07:24.171929   49088 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22021-2487/.minikube/ca.key -> /var/lib/minikube/certs/ca.key
	I1202 19:07:24.171946   49088 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22021-2487/.minikube/proxy-client-ca.crt -> /var/lib/minikube/certs/proxy-client-ca.crt
	I1202 19:07:24.171957   49088 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22021-2487/.minikube/proxy-client-ca.key -> /var/lib/minikube/certs/proxy-client-ca.key
	I1202 19:07:24.171972   49088 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/apiserver.crt -> /var/lib/minikube/certs/apiserver.crt
	I1202 19:07:24.171985   49088 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/apiserver.key -> /var/lib/minikube/certs/apiserver.key
	I1202 19:07:24.172001   49088 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/proxy-client.crt -> /var/lib/minikube/certs/proxy-client.crt
	I1202 19:07:24.172012   49088 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/proxy-client.key -> /var/lib/minikube/certs/proxy-client.key
	I1202 19:07:24.172062   49088 certs.go:484] found cert: /home/jenkins/minikube-integration/22021-2487/.minikube/certs/4435.pem (1338 bytes)
	W1202 19:07:24.172113   49088 certs.go:480] ignoring /home/jenkins/minikube-integration/22021-2487/.minikube/certs/4435_empty.pem, impossibly tiny 0 bytes
	I1202 19:07:24.172126   49088 certs.go:484] found cert: /home/jenkins/minikube-integration/22021-2487/.minikube/certs/ca-key.pem (1679 bytes)
	I1202 19:07:24.172154   49088 certs.go:484] found cert: /home/jenkins/minikube-integration/22021-2487/.minikube/certs/ca.pem (1082 bytes)
	I1202 19:07:24.172189   49088 certs.go:484] found cert: /home/jenkins/minikube-integration/22021-2487/.minikube/certs/cert.pem (1123 bytes)
	I1202 19:07:24.172215   49088 certs.go:484] found cert: /home/jenkins/minikube-integration/22021-2487/.minikube/certs/key.pem (1675 bytes)
	I1202 19:07:24.172266   49088 certs.go:484] found cert: /home/jenkins/minikube-integration/22021-2487/.minikube/files/etc/ssl/certs/44352.pem (1708 bytes)
	I1202 19:07:24.172298   49088 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22021-2487/.minikube/files/etc/ssl/certs/44352.pem -> /usr/share/ca-certificates/44352.pem
	I1202 19:07:24.172314   49088 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22021-2487/.minikube/ca.crt -> /usr/share/ca-certificates/minikubeCA.pem
	I1202 19:07:24.172347   49088 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22021-2487/.minikube/certs/4435.pem -> /usr/share/ca-certificates/4435.pem
	I1202 19:07:24.172878   49088 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I1202 19:07:24.192840   49088 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1675 bytes)
	I1202 19:07:24.210709   49088 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I1202 19:07:24.228270   49088 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1675 bytes)
	I1202 19:07:24.246519   49088 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1424 bytes)
	I1202 19:07:24.264649   49088 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1675 bytes)
	I1202 19:07:24.283289   49088 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I1202 19:07:24.302316   49088 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1675 bytes)
	I1202 19:07:24.320907   49088 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/files/etc/ssl/certs/44352.pem --> /usr/share/ca-certificates/44352.pem (1708 bytes)
	I1202 19:07:24.338895   49088 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I1202 19:07:24.356995   49088 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/certs/4435.pem --> /usr/share/ca-certificates/4435.pem (1338 bytes)
	I1202 19:07:24.374784   49088 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (738 bytes)
	I1202 19:07:24.388173   49088 ssh_runner.go:195] Run: openssl version
	I1202 19:07:24.394457   49088 command_runner.go:130] > OpenSSL 3.0.17 1 Jul 2025 (Library: OpenSSL 3.0.17 1 Jul 2025)
	I1202 19:07:24.394840   49088 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/44352.pem && ln -fs /usr/share/ca-certificates/44352.pem /etc/ssl/certs/44352.pem"
	I1202 19:07:24.403512   49088 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/44352.pem
	I1202 19:07:24.407229   49088 command_runner.go:130] > -rw-r--r-- 1 root root 1708 Dec  2 18:58 /usr/share/ca-certificates/44352.pem
	I1202 19:07:24.407385   49088 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Dec  2 18:58 /usr/share/ca-certificates/44352.pem
	I1202 19:07:24.407455   49088 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/44352.pem
	I1202 19:07:24.448501   49088 command_runner.go:130] > 3ec20f2e
	I1202 19:07:24.448942   49088 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/3ec20f2e.0 || ln -fs /etc/ssl/certs/44352.pem /etc/ssl/certs/3ec20f2e.0"
	I1202 19:07:24.456981   49088 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I1202 19:07:24.465478   49088 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I1202 19:07:24.469306   49088 command_runner.go:130] > -rw-r--r-- 1 root root 1111 Dec  2 18:48 /usr/share/ca-certificates/minikubeCA.pem
	I1202 19:07:24.469374   49088 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Dec  2 18:48 /usr/share/ca-certificates/minikubeCA.pem
	I1202 19:07:24.469438   49088 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I1202 19:07:24.510270   49088 command_runner.go:130] > b5213941
	I1202 19:07:24.510784   49088 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
	I1202 19:07:24.518790   49088 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/4435.pem && ln -fs /usr/share/ca-certificates/4435.pem /etc/ssl/certs/4435.pem"
	I1202 19:07:24.527001   49088 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/4435.pem
	I1202 19:07:24.530919   49088 command_runner.go:130] > -rw-r--r-- 1 root root 1338 Dec  2 18:58 /usr/share/ca-certificates/4435.pem
	I1202 19:07:24.530959   49088 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Dec  2 18:58 /usr/share/ca-certificates/4435.pem
	I1202 19:07:24.531008   49088 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/4435.pem
	I1202 19:07:24.571727   49088 command_runner.go:130] > 51391683
	I1202 19:07:24.572161   49088 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/51391683.0 || ln -fs /etc/ssl/certs/4435.pem /etc/ssl/certs/51391683.0"
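The `openssl x509 -hash` / `ln -fs` sequences above follow OpenSSL's trust-anchor lookup convention: hash the certificate's subject name, then link `<certs-dir>/<hash>.0` at the PEM so libssl can resolve it. A minimal self-contained sketch of the same pattern, using a throwaway self-signed cert in a temp directory (the CN and paths are illustrative, not the ones in this log):

```shell
#!/usr/bin/env bash
set -euo pipefail

# Throwaway workspace standing in for /usr/share/ca-certificates and /etc/ssl/certs.
dir=$(mktemp -d)

# Generate a disposable self-signed CA certificate (illustrative subject).
openssl req -x509 -newkey rsa:2048 -nodes -subj "/CN=demoCA" \
  -keyout "$dir/ca.key" -out "$dir/ca.pem" -days 1 2>/dev/null

# Same command the log runs: derive the 8-hex-digit subject-name hash.
hash=$(openssl x509 -hash -noout -in "$dir/ca.pem")

# Same shape as the log's 'test -L ... || ln -fs ...' step:
# OpenSSL finds trust anchors by looking up <hash>.0 in the certs dir.
[ -L "$dir/$hash.0" ] || ln -fs "$dir/ca.pem" "$dir/$hash.0"

echo "linked $hash.0 -> ca.pem"
```

The `.0` suffix disambiguates distinct certificates whose subjects hash to the same value; a second colliding cert would be linked as `<hash>.1`.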
	I1202 19:07:24.580157   49088 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I1202 19:07:24.584062   49088 command_runner.go:130] >   File: /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I1202 19:07:24.584087   49088 command_runner.go:130] >   Size: 1176      	Blocks: 8          IO Block: 4096   regular file
	I1202 19:07:24.584094   49088 command_runner.go:130] > Device: 259,1	Inode: 848916      Links: 1
	I1202 19:07:24.584101   49088 command_runner.go:130] > Access: (0644/-rw-r--r--)  Uid: (    0/    root)   Gid: (    0/    root)
	I1202 19:07:24.584108   49088 command_runner.go:130] > Access: 2025-12-02 19:03:16.577964732 +0000
	I1202 19:07:24.584114   49088 command_runner.go:130] > Modify: 2025-12-02 18:59:11.965818380 +0000
	I1202 19:07:24.584119   49088 command_runner.go:130] > Change: 2025-12-02 18:59:11.965818380 +0000
	I1202 19:07:24.584125   49088 command_runner.go:130] >  Birth: 2025-12-02 18:59:11.965818380 +0000
	I1202 19:07:24.584207   49088 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-etcd-client.crt -checkend 86400
	I1202 19:07:24.630311   49088 command_runner.go:130] > Certificate will not expire
	I1202 19:07:24.630810   49088 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-kubelet-client.crt -checkend 86400
	I1202 19:07:24.671995   49088 command_runner.go:130] > Certificate will not expire
	I1202 19:07:24.672412   49088 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/server.crt -checkend 86400
	I1202 19:07:24.713648   49088 command_runner.go:130] > Certificate will not expire
	I1202 19:07:24.713758   49088 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/healthcheck-client.crt -checkend 86400
	I1202 19:07:24.754977   49088 command_runner.go:130] > Certificate will not expire
	I1202 19:07:24.755077   49088 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/peer.crt -checkend 86400
	I1202 19:07:24.800995   49088 command_runner.go:130] > Certificate will not expire
	I1202 19:07:24.801486   49088 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/front-proxy-client.crt -checkend 86400
	I1202 19:07:24.844718   49088 command_runner.go:130] > Certificate will not expire
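The `-checkend 86400` probes above ask OpenSSL whether each certificate will still be valid 24 hours (86400 seconds) from now; it prints `Certificate will not expire` and exits 0 when so, and exits nonzero otherwise. A runnable sketch against a freshly generated cert (the subject and file names are illustrative):

```shell
#!/usr/bin/env bash
set -euo pipefail

dir=$(mktemp -d)

# Disposable 30-day certificate, so the 24h check below passes.
openssl req -x509 -newkey rsa:2048 -nodes -subj "/CN=checkend-demo" \
  -keyout "$dir/demo.key" -out "$dir/demo.crt" -days 30 2>/dev/null

# Exit status 0 plus "Certificate will not expire" when validity extends
# past now + 86400s; this is the probe the log repeats for each cert.
openssl x509 -noout -in "$dir/demo.crt" -checkend 86400
```

The nonzero exit on a near-expiry cert is what lets minikube decide whether a certificate needs regeneration before starting the control plane.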
	I1202 19:07:24.845325   49088 kubeadm.go:401] StartCluster: {Name:functional-449836 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-449836 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1202 19:07:24.845410   49088 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:paused Name: Namespaces:[kube-system]}
	I1202 19:07:24.845499   49088 ssh_runner.go:195] Run: sudo -s eval "crictl ps -a --quiet --label io.kubernetes.pod.namespace=kube-system"
	I1202 19:07:24.875465   49088 cri.go:89] found id: ""
	I1202 19:07:24.875565   49088 ssh_runner.go:195] Run: sudo ls /var/lib/kubelet/kubeadm-flags.env /var/lib/kubelet/config.yaml /var/lib/minikube/etcd
	I1202 19:07:24.882887   49088 command_runner.go:130] > /var/lib/kubelet/config.yaml
	I1202 19:07:24.882908   49088 command_runner.go:130] > /var/lib/kubelet/kubeadm-flags.env
	I1202 19:07:24.882928   49088 command_runner.go:130] > /var/lib/minikube/etcd:
	I1202 19:07:24.883961   49088 kubeadm.go:417] found existing configuration files, will attempt cluster restart
	I1202 19:07:24.884012   49088 kubeadm.go:598] restartPrimaryControlPlane start ...
	I1202 19:07:24.884084   49088 ssh_runner.go:195] Run: sudo test -d /data/minikube
	I1202 19:07:24.891632   49088 kubeadm.go:131] /data/minikube skipping compat symlinks: sudo test -d /data/minikube: Process exited with status 1
	stdout:
	
	stderr:
	I1202 19:07:24.892026   49088 kubeconfig.go:47] verify endpoint returned: get endpoint: "functional-449836" does not appear in /home/jenkins/minikube-integration/22021-2487/kubeconfig
	I1202 19:07:24.892129   49088 kubeconfig.go:62] /home/jenkins/minikube-integration/22021-2487/kubeconfig needs updating (will repair): [kubeconfig missing "functional-449836" cluster setting kubeconfig missing "functional-449836" context setting]
	I1202 19:07:24.892546   49088 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/22021-2487/kubeconfig: {Name:mka13b3f16f6b2d645ade32cb83bfcf203300413 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1202 19:07:24.892988   49088 loader.go:402] Config loaded from file:  /home/jenkins/minikube-integration/22021-2487/kubeconfig
	I1202 19:07:24.893140   49088 kapi.go:59] client config for functional-449836: &rest.Config{Host:"https://192.168.49.2:8441", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/client.crt", KeyFile:"/home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/client.key", CAFile:"/home/jenkins/minikube-integration/22021-2487/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x1fb33c0), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), WarningHandlerWithContext:rest.WarningHandlerWithContext(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	I1202 19:07:24.893652   49088 envvar.go:172] "Feature gate default state" feature="WatchListClient" enabled=false
	I1202 19:07:24.893721   49088 cert_rotation.go:141] "Starting client certificate rotation controller" logger="tls-transport-cache"
	I1202 19:07:24.893742   49088 envvar.go:172] "Feature gate default state" feature="ClientsAllowCBOR" enabled=false
	I1202 19:07:24.893817   49088 envvar.go:172] "Feature gate default state" feature="ClientsPreferCBOR" enabled=false
	I1202 19:07:24.893840   49088 envvar.go:172] "Feature gate default state" feature="InformerResourceVersion" enabled=false
	I1202 19:07:24.893879   49088 envvar.go:172] "Feature gate default state" feature="InOrderInformers" enabled=true
	I1202 19:07:24.894204   49088 ssh_runner.go:195] Run: sudo diff -u /var/tmp/minikube/kubeadm.yaml /var/tmp/minikube/kubeadm.yaml.new
	I1202 19:07:24.902267   49088 kubeadm.go:635] The running cluster does not require reconfiguration: 192.168.49.2
	I1202 19:07:24.902298   49088 kubeadm.go:602] duration metric: took 18.265587ms to restartPrimaryControlPlane
	I1202 19:07:24.902309   49088 kubeadm.go:403] duration metric: took 56.993765ms to StartCluster
	I1202 19:07:24.902355   49088 settings.go:142] acquiring lock: {Name:mka76ea0dcf16fdbb68808885f8360c0083029b4 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1202 19:07:24.902437   49088 settings.go:150] Updating kubeconfig:  /home/jenkins/minikube-integration/22021-2487/kubeconfig
	I1202 19:07:24.903036   49088 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/22021-2487/kubeconfig: {Name:mka13b3f16f6b2d645ade32cb83bfcf203300413 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1202 19:07:24.903251   49088 start.go:236] Will wait 6m0s for node &{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}
	I1202 19:07:24.903573   49088 config.go:182] Loaded profile config "functional-449836": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-beta.0
	I1202 19:07:24.903617   49088 addons.go:527] enable addons start: toEnable=map[ambassador:false amd-gpu-device-plugin:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:true efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubetail:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:true storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false]
	I1202 19:07:24.903676   49088 addons.go:70] Setting storage-provisioner=true in profile "functional-449836"
	I1202 19:07:24.903691   49088 addons.go:239] Setting addon storage-provisioner=true in "functional-449836"
	I1202 19:07:24.903717   49088 host.go:66] Checking if "functional-449836" exists ...
	I1202 19:07:24.903830   49088 addons.go:70] Setting default-storageclass=true in profile "functional-449836"
	I1202 19:07:24.903877   49088 addons_storage_classes.go:34] enableOrDisableStorageClasses default-storageclass=true on "functional-449836"
	I1202 19:07:24.904207   49088 cli_runner.go:164] Run: docker container inspect functional-449836 --format={{.State.Status}}
	I1202 19:07:24.904250   49088 cli_runner.go:164] Run: docker container inspect functional-449836 --format={{.State.Status}}
	I1202 19:07:24.909664   49088 out.go:179] * Verifying Kubernetes components...
	I1202 19:07:24.912752   49088 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1202 19:07:24.942660   49088 out.go:179]   - Using image gcr.io/k8s-minikube/storage-provisioner:v5
	I1202 19:07:24.943205   49088 loader.go:402] Config loaded from file:  /home/jenkins/minikube-integration/22021-2487/kubeconfig
	I1202 19:07:24.943381   49088 kapi.go:59] client config for functional-449836: &rest.Config{Host:"https://192.168.49.2:8441", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/client.crt", KeyFile:"/home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/client.key", CAFile:"/home/jenkins/minikube-integration/22021-2487/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x1fb33c0), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), WarningHandlerWithContext:rest.WarningHandlerWithContext(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	I1202 19:07:24.943666   49088 addons.go:239] Setting addon default-storageclass=true in "functional-449836"
	I1202 19:07:24.943695   49088 host.go:66] Checking if "functional-449836" exists ...
	I1202 19:07:24.944105   49088 cli_runner.go:164] Run: docker container inspect functional-449836 --format={{.State.Status}}
	I1202 19:07:24.945588   49088 addons.go:436] installing /etc/kubernetes/addons/storage-provisioner.yaml
	I1202 19:07:24.945617   49088 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/storage-provisioner.yaml (2676 bytes)
	I1202 19:07:24.945676   49088 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-449836
	I1202 19:07:24.976744   49088 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:32788 SSHKeyPath:/home/jenkins/minikube-integration/22021-2487/.minikube/machines/functional-449836/id_rsa Username:docker}
	I1202 19:07:24.983018   49088 addons.go:436] installing /etc/kubernetes/addons/storageclass.yaml
	I1202 19:07:24.983040   49088 ssh_runner.go:362] scp storageclass/storageclass.yaml --> /etc/kubernetes/addons/storageclass.yaml (271 bytes)
	I1202 19:07:24.983109   49088 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-449836
	I1202 19:07:25.013238   49088 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:32788 SSHKeyPath:/home/jenkins/minikube-integration/22021-2487/.minikube/machines/functional-449836/id_rsa Username:docker}
	I1202 19:07:25.139303   49088 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I1202 19:07:25.147308   49088 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1202 19:07:25.166870   49088 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply -f /etc/kubernetes/addons/storageclass.yaml
	I1202 19:07:25.922715   49088 node_ready.go:35] waiting up to 6m0s for node "functional-449836" to be "Ready" ...
	I1202 19:07:25.922842   49088 type.go:168] "Request Body" body=""
	I1202 19:07:25.922904   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:25.923137   49088 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1202 19:07:25.923161   49088 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:25.923181   49088 retry.go:31] will retry after 314.802872ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:25.923212   49088 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1202 19:07:25.923227   49088 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:25.923235   49088 retry.go:31] will retry after 316.161686ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:25.923297   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:26.238968   49088 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1202 19:07:26.239458   49088 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1202 19:07:26.312262   49088 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1202 19:07:26.312301   49088 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:26.312346   49088 retry.go:31] will retry after 358.686092ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:26.320393   49088 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1202 19:07:26.320484   49088 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:26.320525   49088 retry.go:31] will retry after 528.121505ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:26.423804   49088 type.go:168] "Request Body" body=""
	I1202 19:07:26.423895   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:26.424214   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:26.671815   49088 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1202 19:07:26.745439   49088 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1202 19:07:26.745497   49088 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:26.745515   49088 retry.go:31] will retry after 446.477413ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:26.849789   49088 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1202 19:07:26.909069   49088 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1202 19:07:26.909108   49088 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:26.909134   49088 retry.go:31] will retry after 684.877567ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:26.923341   49088 type.go:168] "Request Body" body=""
	I1202 19:07:26.923433   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:26.923791   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:27.192236   49088 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1202 19:07:27.247207   49088 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1202 19:07:27.250502   49088 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:27.250546   49088 retry.go:31] will retry after 797.707708ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:27.423790   49088 type.go:168] "Request Body" body=""
	I1202 19:07:27.423889   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:27.424196   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:27.594774   49088 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1202 19:07:27.660877   49088 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1202 19:07:27.660957   49088 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:27.660987   49088 retry.go:31] will retry after 601.48037ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:27.923401   49088 type.go:168] "Request Body" body=""
	I1202 19:07:27.923475   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:27.923784   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:07:27.923848   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:07:28.049160   49088 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1202 19:07:28.112455   49088 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1202 19:07:28.112493   49088 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:28.112512   49088 retry.go:31] will retry after 941.564206ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:28.262919   49088 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1202 19:07:28.323250   49088 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1202 19:07:28.323307   49088 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:28.323325   49088 retry.go:31] will retry after 741.834409ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:28.423555   49088 type.go:168] "Request Body" body=""
	I1202 19:07:28.423652   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:28.423971   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:28.923658   49088 type.go:168] "Request Body" body=""
	I1202 19:07:28.923731   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:28.923989   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:29.054311   49088 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1202 19:07:29.065740   49088 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1202 19:07:29.126744   49088 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1202 19:07:29.126791   49088 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:29.126812   49088 retry.go:31] will retry after 2.378740888s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:29.143543   49088 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1202 19:07:29.143609   49088 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:29.143631   49088 retry.go:31] will retry after 2.739062704s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:29.422943   49088 type.go:168] "Request Body" body=""
	I1202 19:07:29.423043   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:29.423398   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:29.923116   49088 type.go:168] "Request Body" body=""
	I1202 19:07:29.923203   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:29.923554   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:30.422925   49088 type.go:168] "Request Body" body=""
	I1202 19:07:30.423004   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:30.423294   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:07:30.423351   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:07:30.922969   49088 type.go:168] "Request Body" body=""
	I1202 19:07:30.923047   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:30.923375   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:31.422975   49088 type.go:168] "Request Body" body=""
	I1202 19:07:31.423051   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:31.423376   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:31.506668   49088 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1202 19:07:31.565098   49088 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1202 19:07:31.565149   49088 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:31.565168   49088 retry.go:31] will retry after 3.30231188s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:31.883619   49088 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1202 19:07:31.923118   49088 type.go:168] "Request Body" body=""
	I1202 19:07:31.923190   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:31.923483   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:31.949881   49088 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1202 19:07:31.953682   49088 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:31.953716   49088 retry.go:31] will retry after 2.323480137s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:32.422997   49088 type.go:168] "Request Body" body=""
	I1202 19:07:32.423069   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:32.423382   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:07:32.423441   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:07:32.923115   49088 type.go:168] "Request Body" body=""
	I1202 19:07:32.923193   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:32.923525   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:33.422891   49088 type.go:168] "Request Body" body=""
	I1202 19:07:33.422956   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:33.423209   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:33.922911   49088 type.go:168] "Request Body" body=""
	I1202 19:07:33.922985   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:33.923302   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:34.277557   49088 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1202 19:07:34.337253   49088 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1202 19:07:34.337306   49088 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:34.337326   49088 retry.go:31] will retry after 5.941517157s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:34.423656   49088 type.go:168] "Request Body" body=""
	I1202 19:07:34.423738   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:34.424084   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:07:34.424136   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:07:34.867735   49088 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1202 19:07:34.923406   49088 type.go:168] "Request Body" body=""
	I1202 19:07:34.923506   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:34.923762   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:34.931582   49088 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1202 19:07:34.931622   49088 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:34.931641   49088 retry.go:31] will retry after 5.732328972s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:35.423000   49088 type.go:168] "Request Body" body=""
	I1202 19:07:35.423076   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:35.423385   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:35.922985   49088 type.go:168] "Request Body" body=""
	I1202 19:07:35.923063   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:35.923444   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:36.422994   49088 type.go:168] "Request Body" body=""
	I1202 19:07:36.423069   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:36.423390   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:36.922999   49088 type.go:168] "Request Body" body=""
	I1202 19:07:36.923077   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:36.923397   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:07:36.923453   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:07:37.423120   49088 type.go:168] "Request Body" body=""
	I1202 19:07:37.423196   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:37.423525   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:37.923076   49088 type.go:168] "Request Body" body=""
	I1202 19:07:37.923148   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:37.923450   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:38.423008   49088 type.go:168] "Request Body" body=""
	I1202 19:07:38.423082   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:38.423432   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:38.923000   49088 type.go:168] "Request Body" body=""
	I1202 19:07:38.923074   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:38.923410   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:39.423757   49088 type.go:168] "Request Body" body=""
	I1202 19:07:39.423827   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:39.424076   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:07:39.424115   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:07:39.923939   49088 type.go:168] "Request Body" body=""
	I1202 19:07:39.924013   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:39.924357   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:40.279081   49088 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1202 19:07:40.340610   49088 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1202 19:07:40.340655   49088 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:40.340674   49088 retry.go:31] will retry after 7.832295728s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:40.423881   49088 type.go:168] "Request Body" body=""
	I1202 19:07:40.423959   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:40.424241   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:40.664676   49088 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1202 19:07:40.720825   49088 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1202 19:07:40.724043   49088 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:40.724077   49088 retry.go:31] will retry after 3.410570548s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:40.923400   49088 type.go:168] "Request Body" body=""
	I1202 19:07:40.923497   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:40.923882   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:41.423714   49088 type.go:168] "Request Body" body=""
	I1202 19:07:41.423784   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:41.424115   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:07:41.424172   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:07:41.922915   49088 type.go:168] "Request Body" body=""
	I1202 19:07:41.922990   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:41.923344   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:42.422900   49088 type.go:168] "Request Body" body=""
	I1202 19:07:42.422980   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:42.423254   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:42.922960   49088 type.go:168] "Request Body" body=""
	I1202 19:07:42.923033   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:42.923341   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:43.422982   49088 type.go:168] "Request Body" body=""
	I1202 19:07:43.423067   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:43.423429   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:43.923715   49088 type.go:168] "Request Body" body=""
	I1202 19:07:43.923780   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:43.924087   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:07:43.924145   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:07:44.135480   49088 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1202 19:07:44.194407   49088 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1202 19:07:44.194462   49088 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:44.194482   49088 retry.go:31] will retry after 9.43511002s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:44.423808   49088 type.go:168] "Request Body" body=""
	I1202 19:07:44.423884   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:44.424207   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:44.923173   49088 type.go:168] "Request Body" body=""
	I1202 19:07:44.923287   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:44.923608   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:45.423511   49088 type.go:168] "Request Body" body=""
	I1202 19:07:45.423594   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:45.423852   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:45.923682   49088 type.go:168] "Request Body" body=""
	I1202 19:07:45.923754   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:45.924062   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:46.423867   49088 type.go:168] "Request Body" body=""
	I1202 19:07:46.423945   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:46.424267   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:07:46.424344   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:07:46.923022   49088 type.go:168] "Request Body" body=""
	I1202 19:07:46.923087   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:46.923335   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:47.422976   49088 type.go:168] "Request Body" body=""
	I1202 19:07:47.423049   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:47.423383   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:47.922938   49088 type.go:168] "Request Body" body=""
	I1202 19:07:47.923018   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:47.923343   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:48.173817   49088 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1202 19:07:48.233696   49088 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1202 19:07:48.233741   49088 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:48.233760   49088 retry.go:31] will retry after 11.915058211s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:48.423860   49088 type.go:168] "Request Body" body=""
	I1202 19:07:48.423931   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:48.424192   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:48.922967   49088 type.go:168] "Request Body" body=""
	I1202 19:07:48.923043   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:48.923338   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:07:48.923389   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:07:49.423071   49088 type.go:168] "Request Body" body=""
	I1202 19:07:49.423160   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:49.423457   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:49.923767   49088 type.go:168] "Request Body" body=""
	I1202 19:07:49.923839   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:49.924094   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:50.423628   49088 type.go:168] "Request Body" body=""
	I1202 19:07:50.423697   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:50.424008   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:50.923823   49088 type.go:168] "Request Body" body=""
	I1202 19:07:50.923896   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:50.924199   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:07:50.924253   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:07:51.423846   49088 type.go:168] "Request Body" body=""
	I1202 19:07:51.423915   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:51.424234   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:51.922982   49088 type.go:168] "Request Body" body=""
	I1202 19:07:51.923118   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:51.923450   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:52.423137   49088 type.go:168] "Request Body" body=""
	I1202 19:07:52.423209   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:52.423568   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:52.923553   49088 type.go:168] "Request Body" body=""
	I1202 19:07:52.923629   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:52.923890   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:53.423671   49088 type.go:168] "Request Body" body=""
	I1202 19:07:53.423751   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:53.424089   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:07:53.424151   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:07:53.630602   49088 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1202 19:07:53.701195   49088 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1202 19:07:53.708777   49088 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:53.708825   49088 retry.go:31] will retry after 18.228322251s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:53.923261   49088 type.go:168] "Request Body" body=""
	I1202 19:07:53.923336   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:53.923674   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:54.422909   49088 type.go:168] "Request Body" body=""
	I1202 19:07:54.422976   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:54.423235   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:54.923162   49088 type.go:168] "Request Body" body=""
	I1202 19:07:54.923249   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:54.923575   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:55.422969   49088 type.go:168] "Request Body" body=""
	I1202 19:07:55.423050   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:55.423372   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:55.922912   49088 type.go:168] "Request Body" body=""
	I1202 19:07:55.922984   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:55.923296   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:07:55.923346   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:07:56.422984   49088 type.go:168] "Request Body" body=""
	I1202 19:07:56.423058   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:56.423408   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:56.922983   49088 type.go:168] "Request Body" body=""
	I1202 19:07:56.923062   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:56.923429   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:57.423124   49088 type.go:168] "Request Body" body=""
	I1202 19:07:57.423196   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:57.423456   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:57.922920   49088 type.go:168] "Request Body" body=""
	I1202 19:07:57.922991   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:57.923317   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:07:57.923373   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:07:58.422950   49088 type.go:168] "Request Body" body=""
	I1202 19:07:58.423033   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:58.423366   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:58.923630   49088 type.go:168] "Request Body" body=""
	I1202 19:07:58.923696   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:58.924020   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:59.423807   49088 type.go:168] "Request Body" body=""
	I1202 19:07:59.423887   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:59.424243   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:59.922942   49088 type.go:168] "Request Body" body=""
	I1202 19:07:59.923022   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:59.923353   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:07:59.923410   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:08:00.150075   49088 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1202 19:08:00.323059   49088 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1202 19:08:00.323111   49088 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:08:00.323132   49088 retry.go:31] will retry after 12.256345503s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:08:00.423512   49088 type.go:168] "Request Body" body=""
	I1202 19:08:00.423597   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:00.423977   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:00.923784   49088 type.go:168] "Request Body" body=""
	I1202 19:08:00.923865   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:00.924196   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:01.422909   49088 type.go:168] "Request Body" body=""
	I1202 19:08:01.422990   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:01.423304   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:01.922933   49088 type.go:168] "Request Body" body=""
	I1202 19:08:01.923032   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:01.923287   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:02.422978   49088 type.go:168] "Request Body" body=""
	I1202 19:08:02.423052   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:02.423379   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:08:02.423436   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:08:02.923122   49088 type.go:168] "Request Body" body=""
	I1202 19:08:02.923245   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:02.923555   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:03.423814   49088 type.go:168] "Request Body" body=""
	I1202 19:08:03.423889   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:03.424141   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:03.923895   49088 type.go:168] "Request Body" body=""
	I1202 19:08:03.923996   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:03.924288   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:04.422987   49088 type.go:168] "Request Body" body=""
	I1202 19:08:04.423083   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:04.423375   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:04.922988   49088 type.go:168] "Request Body" body=""
	I1202 19:08:04.923059   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:04.923336   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:08:04.923376   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:08:05.422976   49088 type.go:168] "Request Body" body=""
	I1202 19:08:05.423047   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:05.423391   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:05.922959   49088 type.go:168] "Request Body" body=""
	I1202 19:08:05.923031   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:05.923359   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:06.423783   49088 type.go:168] "Request Body" body=""
	I1202 19:08:06.423854   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:06.424112   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:06.923877   49088 type.go:168] "Request Body" body=""
	I1202 19:08:06.923974   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:06.924303   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:08:06.924381   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:08:07.423044   49088 type.go:168] "Request Body" body=""
	I1202 19:08:07.423125   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:07.423474   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:07.922856   49088 type.go:168] "Request Body" body=""
	I1202 19:08:07.922930   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:07.923205   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:08.422913   49088 type.go:168] "Request Body" body=""
	I1202 19:08:08.422992   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:08.423315   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:08.922886   49088 type.go:168] "Request Body" body=""
	I1202 19:08:08.922995   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:08.923313   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:09.422913   49088 type.go:168] "Request Body" body=""
	I1202 19:08:09.423006   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:09.423295   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:08:09.423343   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:08:09.922894   49088 type.go:168] "Request Body" body=""
	I1202 19:08:09.922995   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:09.923296   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:10.423073   49088 type.go:168] "Request Body" body=""
	I1202 19:08:10.423153   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:10.423491   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:10.923741   49088 type.go:168] "Request Body" body=""
	I1202 19:08:10.923814   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:10.924073   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:11.423834   49088 type.go:168] "Request Body" body=""
	I1202 19:08:11.423907   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:11.424248   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:08:11.424304   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:08:11.922960   49088 type.go:168] "Request Body" body=""
	I1202 19:08:11.923033   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:11.923342   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:11.937687   49088 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1202 19:08:11.996748   49088 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1202 19:08:11.999800   49088 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:08:11.999828   49088 retry.go:31] will retry after 12.016513449s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:08:12.423502   49088 type.go:168] "Request Body" body=""
	I1202 19:08:12.423582   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:12.423831   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:12.580354   49088 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1202 19:08:12.637408   49088 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1202 19:08:12.637456   49088 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:08:12.637477   49088 retry.go:31] will retry after 30.215930355s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:08:12.923948   49088 type.go:168] "Request Body" body=""
	I1202 19:08:12.924043   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:12.924384   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:13.422995   49088 type.go:168] "Request Body" body=""
	I1202 19:08:13.423082   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:13.423402   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:13.923854   49088 type.go:168] "Request Body" body=""
	I1202 19:08:13.923924   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:13.924172   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:08:13.924221   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:08:14.422931   49088 type.go:168] "Request Body" body=""
	I1202 19:08:14.423013   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:14.423360   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:14.923106   49088 type.go:168] "Request Body" body=""
	I1202 19:08:14.923201   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:14.923504   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:15.423455   49088 type.go:168] "Request Body" body=""
	I1202 19:08:15.423543   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:15.423801   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:15.923582   49088 type.go:168] "Request Body" body=""
	I1202 19:08:15.923658   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:15.923982   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:16.423696   49088 type.go:168] "Request Body" body=""
	I1202 19:08:16.423768   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:16.424069   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:08:16.424123   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:08:16.923441   49088 type.go:168] "Request Body" body=""
	I1202 19:08:16.923513   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:16.923823   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:17.423623   49088 type.go:168] "Request Body" body=""
	I1202 19:08:17.423715   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:17.424059   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:17.923916   49088 type.go:168] "Request Body" body=""
	I1202 19:08:17.923987   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:17.924303   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:18.422927   49088 type.go:168] "Request Body" body=""
	I1202 19:08:18.423013   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:18.423293   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:18.922978   49088 type.go:168] "Request Body" body=""
	I1202 19:08:18.923057   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:18.923439   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:08:18.923511   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:08:19.423204   49088 type.go:168] "Request Body" body=""
	I1202 19:08:19.423280   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:19.423633   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:19.923322   49088 type.go:168] "Request Body" body=""
	I1202 19:08:19.923392   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:19.923647   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:20.423776   49088 type.go:168] "Request Body" body=""
	I1202 19:08:20.423870   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:20.424201   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:20.922961   49088 type.go:168] "Request Body" body=""
	I1202 19:08:20.923040   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:20.923370   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:21.423693   49088 type.go:168] "Request Body" body=""
	I1202 19:08:21.423801   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:21.424068   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:08:21.424117   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:08:21.923863   49088 type.go:168] "Request Body" body=""
	I1202 19:08:21.923935   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:21.924262   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:22.422993   49088 type.go:168] "Request Body" body=""
	I1202 19:08:22.423066   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:22.423391   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:22.922927   49088 type.go:168] "Request Body" body=""
	I1202 19:08:22.922998   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:22.923323   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:23.422977   49088 type.go:168] "Request Body" body=""
	I1202 19:08:23.423052   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:23.423364   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:23.922971   49088 type.go:168] "Request Body" body=""
	I1202 19:08:23.923047   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:23.923343   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:08:23.923391   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:08:24.016567   49088 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1202 19:08:24.078750   49088 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1202 19:08:24.078790   49088 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:08:24.078809   49088 retry.go:31] will retry after 37.473532818s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:08:24.423149   49088 type.go:168] "Request Body" body=""
	I1202 19:08:24.423225   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:24.423606   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:24.923585   49088 type.go:168] "Request Body" body=""
	I1202 19:08:24.923686   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:24.924015   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:25.423855   49088 type.go:168] "Request Body" body=""
	I1202 19:08:25.423933   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:25.424248   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:25.923542   49088 type.go:168] "Request Body" body=""
	I1202 19:08:25.923615   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:25.923871   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:08:25.923923   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:08:26.423702   49088 type.go:168] "Request Body" body=""
	I1202 19:08:26.423799   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:26.424100   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:26.923908   49088 type.go:168] "Request Body" body=""
	I1202 19:08:26.923990   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:26.924357   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:27.422916   49088 type.go:168] "Request Body" body=""
	I1202 19:08:27.423000   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:27.423295   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:27.922995   49088 type.go:168] "Request Body" body=""
	I1202 19:08:27.923085   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:27.923386   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:28.423073   49088 type.go:168] "Request Body" body=""
	I1202 19:08:28.423198   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:28.423550   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:08:28.423605   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:08:28.923207   49088 type.go:168] "Request Body" body=""
	I1202 19:08:28.923280   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:28.923547   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:29.423229   49088 type.go:168] "Request Body" body=""
	I1202 19:08:29.423310   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:29.423621   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:29.922966   49088 type.go:168] "Request Body" body=""
	I1202 19:08:29.923038   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:29.923334   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:30.422914   49088 type.go:168] "Request Body" body=""
	I1202 19:08:30.422986   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:30.423290   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:30.922992   49088 type.go:168] "Request Body" body=""
	I1202 19:08:30.923070   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:30.923374   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:08:30.923423   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:08:31.422962   49088 type.go:168] "Request Body" body=""
	I1202 19:08:31.423044   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:31.423370   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:31.923658   49088 type.go:168] "Request Body" body=""
	I1202 19:08:31.923727   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:31.923984   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:32.423783   49088 type.go:168] "Request Body" body=""
	I1202 19:08:32.423853   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:32.424194   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:32.923864   49088 type.go:168] "Request Body" body=""
	I1202 19:08:32.923952   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:32.924274   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:08:32.924361   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:08:33.422870   49088 type.go:168] "Request Body" body=""
	I1202 19:08:33.422935   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:33.423233   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:33.922930   49088 type.go:168] "Request Body" body=""
	I1202 19:08:33.923021   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:33.923340   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:34.423053   49088 type.go:168] "Request Body" body=""
	I1202 19:08:34.423137   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:34.423440   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:34.923286   49088 type.go:168] "Request Body" body=""
	I1202 19:08:34.923353   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:34.923610   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:35.423738   49088 type.go:168] "Request Body" body=""
	I1202 19:08:35.423811   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:35.424122   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:08:35.424178   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:08:35.923982   49088 type.go:168] "Request Body" body=""
	I1202 19:08:35.924054   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:35.924397   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:36.423490   49088 type.go:168] "Request Body" body=""
	I1202 19:08:36.423577   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:36.423904   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:36.923609   49088 type.go:168] "Request Body" body=""
	I1202 19:08:36.923698   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:36.924003   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:37.423836   49088 type.go:168] "Request Body" body=""
	I1202 19:08:37.423908   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:37.424273   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:08:37.424347   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:08:37.923878   49088 type.go:168] "Request Body" body=""
	I1202 19:08:37.923949   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:37.924222   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:38.422922   49088 type.go:168] "Request Body" body=""
	I1202 19:08:38.422995   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:38.423329   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:38.923060   49088 type.go:168] "Request Body" body=""
	I1202 19:08:38.923133   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:38.923451   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:39.422914   49088 type.go:168] "Request Body" body=""
	I1202 19:08:39.422990   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:39.423240   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:39.922980   49088 type.go:168] "Request Body" body=""
	I1202 19:08:39.923057   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:39.923354   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:08:39.923403   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:08:40.423331   49088 type.go:168] "Request Body" body=""
	I1202 19:08:40.423423   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:40.423754   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:40.923541   49088 type.go:168] "Request Body" body=""
	I1202 19:08:40.923652   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:40.923952   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:41.423758   49088 type.go:168] "Request Body" body=""
	I1202 19:08:41.423829   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:41.424159   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:41.922904   49088 type.go:168] "Request Body" body=""
	I1202 19:08:41.922987   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:41.923363   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:42.423568   49088 type.go:168] "Request Body" body=""
	I1202 19:08:42.423637   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:42.423879   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:08:42.423921   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:08:42.854609   49088 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1202 19:08:42.913285   49088 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1202 19:08:42.916268   49088 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:08:42.916300   49088 retry.go:31] will retry after 24.794449401s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
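The `retry.go:31` entry above shows minikube backing off before re-running `kubectl apply` for the addon. A minimal sketch of that retry-with-backoff pattern (hypothetical names, not minikube's actual implementation, which also adds jitter to the wait), assuming a doubling delay capped at a maximum:

```go
package main

import (
	"errors"
	"fmt"
	"time"
)

// retryWithBackoff runs op up to attempts times, doubling the wait between
// failures until it reaches maxWait. Returns the last error if all attempts fail.
func retryWithBackoff(attempts int, initial, maxWait time.Duration, op func() error) error {
	wait := initial
	var err error
	for i := 0; i < attempts; i++ {
		if err = op(); err == nil {
			return nil
		}
		fmt.Printf("will retry after %v: %v\n", wait, err)
		time.Sleep(wait)
		wait *= 2
		if wait > maxWait {
			wait = maxWait
		}
	}
	return err
}

func main() {
	calls := 0
	// Simulate an apiserver that refuses connections twice, then recovers.
	err := retryWithBackoff(4, 10*time.Millisecond, 40*time.Millisecond, func() error {
		calls++
		if calls < 3 {
			return errors.New("connect: connection refused")
		}
		return nil
	})
	fmt.Println("calls:", calls, "err:", err)
}
```

Here the third attempt succeeds, so the loop stops early instead of exhausting all four attempts.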
	I1202 19:08:42.923470   49088 type.go:168] "Request Body" body=""
	I1202 19:08:42.923553   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:42.923860   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:43.423622   49088 type.go:168] "Request Body" body=""
	I1202 19:08:43.423694   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:43.423983   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:43.923751   49088 type.go:168] "Request Body" body=""
	I1202 19:08:43.923834   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:43.924123   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:44.422913   49088 type.go:168] "Request Body" body=""
	I1202 19:08:44.423014   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:44.423327   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:44.923006   49088 type.go:168] "Request Body" body=""
	I1202 19:08:44.923080   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:44.923422   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:08:44.923476   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:08:45.422929   49088 type.go:168] "Request Body" body=""
	I1202 19:08:45.423000   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:45.423274   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:45.922967   49088 type.go:168] "Request Body" body=""
	I1202 19:08:45.923040   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:45.923364   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:46.422980   49088 type.go:168] "Request Body" body=""
	I1202 19:08:46.423051   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:46.423401   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:46.922941   49088 type.go:168] "Request Body" body=""
	I1202 19:08:46.923013   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:46.923277   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:47.422968   49088 type.go:168] "Request Body" body=""
	I1202 19:08:47.423062   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:47.423388   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:08:47.423440   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:08:47.922972   49088 type.go:168] "Request Body" body=""
	I1202 19:08:47.923042   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:47.923383   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:48.423521   49088 type.go:168] "Request Body" body=""
	I1202 19:08:48.423697   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:48.424010   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:48.923782   49088 type.go:168] "Request Body" body=""
	I1202 19:08:48.923856   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:48.924186   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:49.422919   49088 type.go:168] "Request Body" body=""
	I1202 19:08:49.422992   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:49.423372   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:49.922931   49088 type.go:168] "Request Body" body=""
	I1202 19:08:49.923004   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:49.923306   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:08:49.923362   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:08:50.423001   49088 type.go:168] "Request Body" body=""
	I1202 19:08:50.423076   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:50.423400   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:50.922993   49088 type.go:168] "Request Body" body=""
	I1202 19:08:50.923072   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:50.923422   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:51.423082   49088 type.go:168] "Request Body" body=""
	I1202 19:08:51.423174   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:51.423497   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:51.922985   49088 type.go:168] "Request Body" body=""
	I1202 19:08:51.923056   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:51.923335   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:08:51.923382   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:08:52.423062   49088 type.go:168] "Request Body" body=""
	I1202 19:08:52.423141   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:52.423469   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:52.922906   49088 type.go:168] "Request Body" body=""
	I1202 19:08:52.922982   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:52.923304   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:53.422987   49088 type.go:168] "Request Body" body=""
	I1202 19:08:53.423074   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:53.423405   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:53.923177   49088 type.go:168] "Request Body" body=""
	I1202 19:08:53.923253   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:53.923577   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:08:53.923645   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:08:54.422880   49088 type.go:168] "Request Body" body=""
	I1202 19:08:54.422958   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:54.423220   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:54.923069   49088 type.go:168] "Request Body" body=""
	I1202 19:08:54.923142   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:54.923466   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:55.423373   49088 type.go:168] "Request Body" body=""
	I1202 19:08:55.423459   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:55.423806   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:55.923603   49088 type.go:168] "Request Body" body=""
	I1202 19:08:55.923681   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:55.923944   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:08:55.923992   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:08:56.423750   49088 type.go:168] "Request Body" body=""
	I1202 19:08:56.423821   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:56.424196   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:56.922939   49088 type.go:168] "Request Body" body=""
	I1202 19:08:56.923015   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:56.923399   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:57.423718   49088 type.go:168] "Request Body" body=""
	I1202 19:08:57.423789   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:57.424085   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:57.923903   49088 type.go:168] "Request Body" body=""
	I1202 19:08:57.923980   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:57.924302   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:08:57.924374   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:08:58.422958   49088 type.go:168] "Request Body" body=""
	I1202 19:08:58.423030   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:58.423367   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:58.923777   49088 type.go:168] "Request Body" body=""
	I1202 19:08:58.923851   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:58.924127   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:59.423881   49088 type.go:168] "Request Body" body=""
	I1202 19:08:59.423956   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:59.424305   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:59.922904   49088 type.go:168] "Request Body" body=""
	I1202 19:08:59.922978   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:59.923298   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:00.423245   49088 type.go:168] "Request Body" body=""
	I1202 19:09:00.423318   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:00.423619   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:09:00.423665   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:09:00.922984   49088 type.go:168] "Request Body" body=""
	I1202 19:09:00.923063   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:00.923384   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:01.422976   49088 type.go:168] "Request Body" body=""
	I1202 19:09:01.423055   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:01.423390   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:01.552630   49088 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1202 19:09:01.616821   49088 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1202 19:09:01.616872   49088 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1202 19:09:01.616967   49088 out.go:285] ! Enabling 'storage-provisioner' returned an error: running callbacks: [sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	]
	! Enabling 'storage-provisioner' returned an error: running callbacks: [sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	]
	I1202 19:09:01.923268   49088 type.go:168] "Request Body" body=""
	I1202 19:09:01.923333   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:01.923595   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:02.423036   49088 type.go:168] "Request Body" body=""
	I1202 19:09:02.423106   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:02.423425   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:02.922957   49088 type.go:168] "Request Body" body=""
	I1202 19:09:02.923030   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:02.923349   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:09:02.923411   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:09:03.422870   49088 type.go:168] "Request Body" body=""
	I1202 19:09:03.422937   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:03.423202   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:03.922969   49088 type.go:168] "Request Body" body=""
	I1202 19:09:03.923048   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:03.923382   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:04.422966   49088 type.go:168] "Request Body" body=""
	I1202 19:09:04.423045   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:04.423360   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:04.923022   49088 type.go:168] "Request Body" body=""
	I1202 19:09:04.923097   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:04.923357   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:05.423319   49088 type.go:168] "Request Body" body=""
	I1202 19:09:05.423392   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:05.423740   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:09:05.423793   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:09:05.923326   49088 type.go:168] "Request Body" body=""
	I1202 19:09:05.923409   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:05.923718   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:06.423454   49088 type.go:168] "Request Body" body=""
	I1202 19:09:06.423525   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:06.423826   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:06.923640   49088 type.go:168] "Request Body" body=""
	I1202 19:09:06.923716   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:06.924092   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:07.423755   49088 type.go:168] "Request Body" body=""
	I1202 19:09:07.423833   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:07.424174   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:09:07.424240   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:09:07.711667   49088 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1202 19:09:07.768083   49088 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1202 19:09:07.771273   49088 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1202 19:09:07.771371   49088 out.go:285] ! Enabling 'default-storageclass' returned an error: running callbacks: [sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	]
	! Enabling 'default-storageclass' returned an error: running callbacks: [sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	]
	I1202 19:09:07.774489   49088 out.go:179] * Enabled addons: 
	I1202 19:09:07.778178   49088 addons.go:530] duration metric: took 1m42.874553995s for enable addons: enabled=[]
	I1202 19:09:07.923592   49088 type.go:168] "Request Body" body=""
	I1202 19:09:07.923663   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:07.923975   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:08.423753   49088 type.go:168] "Request Body" body=""
	I1202 19:09:08.423867   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:08.424222   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:08.922931   49088 type.go:168] "Request Body" body=""
	I1202 19:09:08.923003   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:08.923358   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:09.423790   49088 type.go:168] "Request Body" body=""
	I1202 19:09:09.423880   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:09.424153   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:09.922907   49088 type.go:168] "Request Body" body=""
	I1202 19:09:09.923001   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:09.923324   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:09:09.923374   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:09:10.423145   49088 type.go:168] "Request Body" body=""
	I1202 19:09:10.423260   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:10.423579   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:10.922932   49088 type.go:168] "Request Body" body=""
	I1202 19:09:10.923028   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:10.923400   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:11.422981   49088 type.go:168] "Request Body" body=""
	I1202 19:09:11.423062   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:11.423397   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:11.923082   49088 type.go:168] "Request Body" body=""
	I1202 19:09:11.923154   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:11.923464   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:09:11.923521   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:09:12.422896   49088 type.go:168] "Request Body" body=""
	I1202 19:09:12.422975   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:12.423250   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:12.922964   49088 type.go:168] "Request Body" body=""
	I1202 19:09:12.923043   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:12.923387   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:13.423093   49088 type.go:168] "Request Body" body=""
	I1202 19:09:13.423175   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:13.423500   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:13.923207   49088 type.go:168] "Request Body" body=""
	I1202 19:09:13.923272   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:13.923535   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:09:13.923574   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:09:14.422969   49088 type.go:168] "Request Body" body=""
	I1202 19:09:14.423065   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:14.423377   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:14.923293   49088 type.go:168] "Request Body" body=""
	I1202 19:09:14.923367   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:14.923688   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:15.423514   49088 type.go:168] "Request Body" body=""
	I1202 19:09:15.423584   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:15.423882   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:15.923633   49088 type.go:168] "Request Body" body=""
	I1202 19:09:15.923702   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:15.924013   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:09:15.924083   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:09:16.423892   49088 type.go:168] "Request Body" body=""
	I1202 19:09:16.423994   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:16.424346   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:16.922929   49088 type.go:168] "Request Body" body=""
	I1202 19:09:16.922996   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:16.923246   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:17.422959   49088 type.go:168] "Request Body" body=""
	I1202 19:09:17.423030   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:17.423344   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:17.923012   49088 type.go:168] "Request Body" body=""
	I1202 19:09:17.923112   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:17.923445   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:18.423579   49088 type.go:168] "Request Body" body=""
	I1202 19:09:18.423646   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:18.423912   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:09:18.423954   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:09:18.923689   49088 type.go:168] "Request Body" body=""
	I1202 19:09:18.923816   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:18.924164   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:19.423865   49088 type.go:168] "Request Body" body=""
	I1202 19:09:19.423938   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:19.424264   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:19.922971   49088 type.go:168] "Request Body" body=""
	I1202 19:09:19.923065   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:19.928773   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=5
	I1202 19:09:20.423719   49088 type.go:168] "Request Body" body=""
	I1202 19:09:20.423798   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:20.424108   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:09:20.424158   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:09:20.923797   49088 type.go:168] "Request Body" body=""
	I1202 19:09:20.923876   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:20.924234   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:21.423890   49088 type.go:168] "Request Body" body=""
	I1202 19:09:21.423990   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:21.424292   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:21.922965   49088 type.go:168] "Request Body" body=""
	I1202 19:09:21.923044   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:21.923382   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:22.422969   49088 type.go:168] "Request Body" body=""
	I1202 19:09:22.423043   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:22.423383   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:22.923067   49088 type.go:168] "Request Body" body=""
	I1202 19:09:22.923150   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:22.923449   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:09:22.923501   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:09:23.423230   49088 type.go:168] "Request Body" body=""
	I1202 19:09:23.423312   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:23.423745   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:23.922960   49088 type.go:168] "Request Body" body=""
	I1202 19:09:23.923037   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:23.923364   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:24.423610   49088 type.go:168] "Request Body" body=""
	I1202 19:09:24.423697   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:24.423973   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:24.922875   49088 type.go:168] "Request Body" body=""
	I1202 19:09:24.922950   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:24.923296   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:25.422971   49088 type.go:168] "Request Body" body=""
	I1202 19:09:25.423045   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:25.423397   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:09:25.423453   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:09:25.923781   49088 type.go:168] "Request Body" body=""
	I1202 19:09:25.923849   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:25.924111   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:26.423856   49088 type.go:168] "Request Body" body=""
	I1202 19:09:26.423928   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:26.424242   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:26.922969   49088 type.go:168] "Request Body" body=""
	I1202 19:09:26.923039   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:26.923392   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:27.423069   49088 type.go:168] "Request Body" body=""
	I1202 19:09:27.423144   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:27.423407   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:27.922946   49088 type.go:168] "Request Body" body=""
	I1202 19:09:27.923018   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:27.923358   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:09:27.923411   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:09:28.423076   49088 type.go:168] "Request Body" body=""
	I1202 19:09:28.423152   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:28.423495   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:28.923840   49088 type.go:168] "Request Body" body=""
	I1202 19:09:28.923913   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:28.924219   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:29.422911   49088 type.go:168] "Request Body" body=""
	I1202 19:09:29.422989   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:29.423326   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:29.923042   49088 type.go:168] "Request Body" body=""
	I1202 19:09:29.923119   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:29.923480   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:09:29.923538   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:09:30.422961   49088 type.go:168] "Request Body" body=""
	I1202 19:09:30.423047   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:30.423371   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:30.922960   49088 type.go:168] "Request Body" body=""
	I1202 19:09:30.923040   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:30.923370   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:31.423083   49088 type.go:168] "Request Body" body=""
	I1202 19:09:31.423155   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:31.423507   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:31.922881   49088 type.go:168] "Request Body" body=""
	I1202 19:09:31.922954   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:31.923312   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:32.422968   49088 type.go:168] "Request Body" body=""
	I1202 19:09:32.423060   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:32.423400   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:09:32.423472   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:09:32.922956   49088 type.go:168] "Request Body" body=""
	I1202 19:09:32.923050   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:32.923391   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:33.423100   49088 type.go:168] "Request Body" body=""
	I1202 19:09:33.423172   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:33.423473   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:33.922963   49088 type.go:168] "Request Body" body=""
	I1202 19:09:33.923039   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:33.923379   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:34.423096   49088 type.go:168] "Request Body" body=""
	I1202 19:09:34.423177   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:34.423484   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:09:34.423532   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:09:34.923381   49088 type.go:168] "Request Body" body=""
	I1202 19:09:34.923452   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:34.923763   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:35.423619   49088 type.go:168] "Request Body" body=""
	I1202 19:09:35.423698   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:35.424059   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:35.923863   49088 type.go:168] "Request Body" body=""
	I1202 19:09:35.923933   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:35.924297   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:36.423817   49088 type.go:168] "Request Body" body=""
	I1202 19:09:36.423883   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:36.424153   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:09:36.424193   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:09:36.922887   49088 type.go:168] "Request Body" body=""
	I1202 19:09:36.922964   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:36.923304   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:37.422984   49088 type.go:168] "Request Body" body=""
	I1202 19:09:37.423062   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:37.423400   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:37.922914   49088 type.go:168] "Request Body" body=""
	I1202 19:09:37.922987   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:37.923253   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:38.422943   49088 type.go:168] "Request Body" body=""
	I1202 19:09:38.423020   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:38.423341   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:38.922960   49088 type.go:168] "Request Body" body=""
	I1202 19:09:38.923084   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:38.923406   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:09:38.923459   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:09:39.423118   49088 type.go:168] "Request Body" body=""
	I1202 19:09:39.423188   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:39.423443   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:39.922963   49088 type.go:168] "Request Body" body=""
	I1202 19:09:39.923038   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:39.923390   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:40.422957   49088 type.go:168] "Request Body" body=""
	I1202 19:09:40.423031   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:40.423438   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:40.922909   49088 type.go:168] "Request Body" body=""
	I1202 19:09:40.922985   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:40.923328   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:41.422954   49088 type.go:168] "Request Body" body=""
	I1202 19:09:41.423029   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:41.423387   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:09:41.423450   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:09:41.923108   49088 type.go:168] "Request Body" body=""
	I1202 19:09:41.923187   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:41.923536   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:42.423214   49088 type.go:168] "Request Body" body=""
	I1202 19:09:42.423293   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:42.423567   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:42.922993   49088 type.go:168] "Request Body" body=""
	I1202 19:09:42.923073   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:42.923422   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:43.422977   49088 type.go:168] "Request Body" body=""
	I1202 19:09:43.423055   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:43.423398   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:43.923097   49088 type.go:168] "Request Body" body=""
	I1202 19:09:43.923183   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:43.923554   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:09:43.923610   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:09:44.422960   49088 type.go:168] "Request Body" body=""
	I1202 19:09:44.423031   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:44.423372   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:44.923133   49088 type.go:168] "Request Body" body=""
	I1202 19:09:44.923217   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:44.923568   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:45.422996   49088 type.go:168] "Request Body" body=""
	I1202 19:09:45.423065   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:45.423325   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:45.922966   49088 type.go:168] "Request Body" body=""
	I1202 19:09:45.923043   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:45.923392   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:46.422965   49088 type.go:168] "Request Body" body=""
	I1202 19:09:46.423041   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:46.423338   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:09:46.423383   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:09:46.923662   49088 type.go:168] "Request Body" body=""
	I1202 19:09:46.923729   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:46.923996   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:47.423794   49088 type.go:168] "Request Body" body=""
	I1202 19:09:47.423868   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:47.424199   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:47.922887   49088 type.go:168] "Request Body" body=""
	I1202 19:09:47.922970   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:47.923290   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:48.423862   49088 type.go:168] "Request Body" body=""
	I1202 19:09:48.423930   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:48.424197   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:09:48.424242   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:09:48.922882   49088 type.go:168] "Request Body" body=""
	I1202 19:09:48.922964   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:48.923305   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:49.422874   49088 type.go:168] "Request Body" body=""
	I1202 19:09:49.422952   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:49.423501   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:49.923227   49088 type.go:168] "Request Body" body=""
	I1202 19:09:49.923298   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:49.923571   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:50.423530   49088 type.go:168] "Request Body" body=""
	I1202 19:09:50.423605   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:50.423930   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:50.923709   49088 type.go:168] "Request Body" body=""
	I1202 19:09:50.923791   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:50.924129   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:09:50.924183   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:09:51.423574   49088 type.go:168] "Request Body" body=""
	I1202 19:09:51.423645   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:51.423989   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:51.923773   49088 type.go:168] "Request Body" body=""
	I1202 19:09:51.923846   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:51.924175   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:52.423792   49088 type.go:168] "Request Body" body=""
	I1202 19:09:52.423865   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:52.424194   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:52.923791   49088 type.go:168] "Request Body" body=""
	I1202 19:09:52.923863   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:52.924133   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:53.423850   49088 type.go:168] "Request Body" body=""
	I1202 19:09:53.423929   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:53.424252   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:09:53.424366   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:09:53.922972   49088 type.go:168] "Request Body" body=""
	I1202 19:09:53.923047   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:53.923376   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	[... identical GET polls of /api/v1/nodes/functional-449836 repeated at ~500 ms intervals from 19:09:54 through 19:10:28, each returning an empty response (status="" milliseconds=0); node_ready.go surfaced the following retry warnings during that window ...]
	W1202 19:09:55.924066   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	W1202 19:09:58.423443   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	W1202 19:10:00.435913   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	W1202 19:10:02.923500   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	W1202 19:10:04.923736   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	W1202 19:10:07.423547   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	W1202 19:10:09.923465   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	W1202 19:10:12.423327   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	W1202 19:10:14.423418   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	W1202 19:10:16.924112   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	W1202 19:10:19.423505   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	W1202 19:10:21.923415   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	W1202 19:10:23.923622   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	W1202 19:10:25.924103   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	W1202 19:10:27.924397   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:10:28.922918   49088 type.go:168] "Request Body" body=""
	I1202 19:10:28.922988   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:28.923270   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:29.422990   49088 type.go:168] "Request Body" body=""
	I1202 19:10:29.423072   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:29.423426   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:29.922978   49088 type.go:168] "Request Body" body=""
	I1202 19:10:29.923053   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:29.923386   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:30.422933   49088 type.go:168] "Request Body" body=""
	I1202 19:10:30.423008   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:30.423306   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:10:30.423392   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:10:30.923054   49088 type.go:168] "Request Body" body=""
	I1202 19:10:30.923133   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:30.923439   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:31.423123   49088 type.go:168] "Request Body" body=""
	I1202 19:10:31.423203   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:31.423539   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:31.923853   49088 type.go:168] "Request Body" body=""
	I1202 19:10:31.923920   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:31.924180   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:32.422889   49088 type.go:168] "Request Body" body=""
	I1202 19:10:32.422970   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:32.423319   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:32.922910   49088 type.go:168] "Request Body" body=""
	I1202 19:10:32.922985   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:32.923321   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:10:32.923375   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:10:33.423014   49088 type.go:168] "Request Body" body=""
	I1202 19:10:33.423095   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:33.423398   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:33.923072   49088 type.go:168] "Request Body" body=""
	I1202 19:10:33.923148   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:33.923513   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:34.423108   49088 type.go:168] "Request Body" body=""
	I1202 19:10:34.423190   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:34.423541   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:34.923286   49088 type.go:168] "Request Body" body=""
	I1202 19:10:34.923355   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:34.923630   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:10:34.923672   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:10:35.423752   49088 type.go:168] "Request Body" body=""
	I1202 19:10:35.423834   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:35.424190   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:35.922887   49088 type.go:168] "Request Body" body=""
	I1202 19:10:35.922967   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:35.923295   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:36.422982   49088 type.go:168] "Request Body" body=""
	I1202 19:10:36.423054   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:36.423305   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:36.923037   49088 type.go:168] "Request Body" body=""
	I1202 19:10:36.923115   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:36.923442   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:37.423029   49088 type.go:168] "Request Body" body=""
	I1202 19:10:37.423102   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:37.423425   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:10:37.423480   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:10:37.923773   49088 type.go:168] "Request Body" body=""
	I1202 19:10:37.923861   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:37.924136   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:38.423899   49088 type.go:168] "Request Body" body=""
	I1202 19:10:38.423979   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:38.424296   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:38.922970   49088 type.go:168] "Request Body" body=""
	I1202 19:10:38.923051   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:38.923372   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:39.423713   49088 type.go:168] "Request Body" body=""
	I1202 19:10:39.423780   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:39.424040   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:10:39.424081   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:10:39.923835   49088 type.go:168] "Request Body" body=""
	I1202 19:10:39.923908   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:39.924227   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:40.423209   49088 type.go:168] "Request Body" body=""
	I1202 19:10:40.423286   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:40.423612   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:40.923000   49088 type.go:168] "Request Body" body=""
	I1202 19:10:40.923083   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:40.923387   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:41.422980   49088 type.go:168] "Request Body" body=""
	I1202 19:10:41.423055   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:41.423415   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:41.922974   49088 type.go:168] "Request Body" body=""
	I1202 19:10:41.923048   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:41.923367   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:10:41.923430   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:10:42.422892   49088 type.go:168] "Request Body" body=""
	I1202 19:10:42.422960   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:42.423218   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:42.922977   49088 type.go:168] "Request Body" body=""
	I1202 19:10:42.923049   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:42.923389   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:43.422967   49088 type.go:168] "Request Body" body=""
	I1202 19:10:43.423039   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:43.423388   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:43.922911   49088 type.go:168] "Request Body" body=""
	I1202 19:10:43.922995   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:43.923286   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:44.422975   49088 type.go:168] "Request Body" body=""
	I1202 19:10:44.423057   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:44.423388   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:10:44.423441   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:10:44.923165   49088 type.go:168] "Request Body" body=""
	I1202 19:10:44.923245   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:44.923572   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:45.423613   49088 type.go:168] "Request Body" body=""
	I1202 19:10:45.423695   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:45.423958   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:45.924133   49088 type.go:168] "Request Body" body=""
	I1202 19:10:45.924208   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:45.924557   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:46.423281   49088 type.go:168] "Request Body" body=""
	I1202 19:10:46.423365   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:46.423700   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:10:46.423760   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:10:46.923435   49088 type.go:168] "Request Body" body=""
	I1202 19:10:46.923504   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:46.923772   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:47.423542   49088 type.go:168] "Request Body" body=""
	I1202 19:10:47.423616   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:47.423946   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:47.923718   49088 type.go:168] "Request Body" body=""
	I1202 19:10:47.923790   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:47.924128   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:48.423446   49088 type.go:168] "Request Body" body=""
	I1202 19:10:48.423517   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:48.423772   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:10:48.423814   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:10:48.923540   49088 type.go:168] "Request Body" body=""
	I1202 19:10:48.923616   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:48.923948   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:49.423625   49088 type.go:168] "Request Body" body=""
	I1202 19:10:49.423703   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:49.424044   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:49.923749   49088 type.go:168] "Request Body" body=""
	I1202 19:10:49.923817   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:49.924118   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:50.422979   49088 type.go:168] "Request Body" body=""
	I1202 19:10:50.423051   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:50.423386   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:50.923086   49088 type.go:168] "Request Body" body=""
	I1202 19:10:50.923163   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:50.923498   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:10:50.923558   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:10:51.422877   49088 type.go:168] "Request Body" body=""
	I1202 19:10:51.422947   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:51.423236   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:51.922968   49088 type.go:168] "Request Body" body=""
	I1202 19:10:51.923043   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:51.923370   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:52.423002   49088 type.go:168] "Request Body" body=""
	I1202 19:10:52.423076   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:52.423410   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:52.922879   49088 type.go:168] "Request Body" body=""
	I1202 19:10:52.922948   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:52.923224   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:53.422960   49088 type.go:168] "Request Body" body=""
	I1202 19:10:53.423034   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:53.423361   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:10:53.423412   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:10:53.923077   49088 type.go:168] "Request Body" body=""
	I1202 19:10:53.923157   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:53.923495   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:54.422917   49088 type.go:168] "Request Body" body=""
	I1202 19:10:54.422986   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:54.423246   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:54.923135   49088 type.go:168] "Request Body" body=""
	I1202 19:10:54.923208   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:54.923556   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:55.423559   49088 type.go:168] "Request Body" body=""
	I1202 19:10:55.423636   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:55.423971   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:10:55.424022   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:10:55.923605   49088 type.go:168] "Request Body" body=""
	I1202 19:10:55.923678   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:55.923946   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:56.423714   49088 type.go:168] "Request Body" body=""
	I1202 19:10:56.423787   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:56.424129   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:56.923785   49088 type.go:168] "Request Body" body=""
	I1202 19:10:56.923856   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:56.924173   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:57.423656   49088 type.go:168] "Request Body" body=""
	I1202 19:10:57.423734   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:57.423996   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:57.923693   49088 type.go:168] "Request Body" body=""
	I1202 19:10:57.923764   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:57.924078   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:10:57.924131   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:10:58.423895   49088 type.go:168] "Request Body" body=""
	I1202 19:10:58.423973   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:58.424286   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:58.922875   49088 type.go:168] "Request Body" body=""
	I1202 19:10:58.922950   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:58.923200   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:59.422980   49088 type.go:168] "Request Body" body=""
	I1202 19:10:59.423061   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:59.423383   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:59.922968   49088 type.go:168] "Request Body" body=""
	I1202 19:10:59.923042   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:59.923368   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:00.423287   49088 type.go:168] "Request Body" body=""
	I1202 19:11:00.423360   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:00.423657   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:11:00.423700   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:11:00.922983   49088 type.go:168] "Request Body" body=""
	I1202 19:11:00.923059   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:00.923393   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:01.423104   49088 type.go:168] "Request Body" body=""
	I1202 19:11:01.423186   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:01.423527   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:01.923028   49088 type.go:168] "Request Body" body=""
	I1202 19:11:01.923097   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:01.923355   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:02.422961   49088 type.go:168] "Request Body" body=""
	I1202 19:11:02.423036   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:02.423393   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:02.923093   49088 type.go:168] "Request Body" body=""
	I1202 19:11:02.923169   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:02.923483   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:11:02.923543   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:11:03.422923   49088 type.go:168] "Request Body" body=""
	I1202 19:11:03.422993   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:03.423275   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:03.922966   49088 type.go:168] "Request Body" body=""
	I1202 19:11:03.923046   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:03.923401   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:04.423109   49088 type.go:168] "Request Body" body=""
	I1202 19:11:04.423183   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:04.423480   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:04.922963   49088 type.go:168] "Request Body" body=""
	I1202 19:11:04.923029   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:04.923291   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:05.422943   49088 type.go:168] "Request Body" body=""
	I1202 19:11:05.423019   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:05.423416   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:11:05.423485   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:11:05.922975   49088 type.go:168] "Request Body" body=""
	I1202 19:11:05.923049   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:05.923391   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:06.423068   49088 type.go:168] "Request Body" body=""
	I1202 19:11:06.423143   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:06.423404   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:06.922954   49088 type.go:168] "Request Body" body=""
	I1202 19:11:06.923033   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:06.923371   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:07.423090   49088 type.go:168] "Request Body" body=""
	I1202 19:11:07.423173   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:07.423511   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:11:07.423577   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:11:07.922961   49088 type.go:168] "Request Body" body=""
	I1202 19:11:07.923059   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:07.923477   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:08.423196   49088 type.go:168] "Request Body" body=""
	I1202 19:11:08.423268   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:08.423618   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:08.923317   49088 type.go:168] "Request Body" body=""
	I1202 19:11:08.923395   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:08.923714   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:09.423471   49088 type.go:168] "Request Body" body=""
	I1202 19:11:09.423536   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:09.423793   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:11:09.423831   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:11:09.923592   49088 type.go:168] "Request Body" body=""
	I1202 19:11:09.923672   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:09.923995   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:10.422883   49088 type.go:168] "Request Body" body=""
	I1202 19:11:10.422965   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:10.423316   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:10.922966   49088 type.go:168] "Request Body" body=""
	I1202 19:11:10.923035   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:10.923371   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:11.422986   49088 type.go:168] "Request Body" body=""
	I1202 19:11:11.423060   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:11.423401   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:11.923095   49088 type.go:168] "Request Body" body=""
	I1202 19:11:11.923169   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:11.923521   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:11:11.923576   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:11:12.423858   49088 type.go:168] "Request Body" body=""
	I1202 19:11:12.423929   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:12.424221   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:12.922920   49088 type.go:168] "Request Body" body=""
	I1202 19:11:12.923007   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:12.923376   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:13.423103   49088 type.go:168] "Request Body" body=""
	I1202 19:11:13.423187   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:13.423573   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:13.923844   49088 type.go:168] "Request Body" body=""
	I1202 19:11:13.923913   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:13.924261   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:11:13.924357   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:11:14.422970   49088 type.go:168] "Request Body" body=""
	I1202 19:11:14.423042   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:14.423400   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:14.923176   49088 type.go:168] "Request Body" body=""
	I1202 19:11:14.923259   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:14.923607   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:15.423538   49088 type.go:168] "Request Body" body=""
	I1202 19:11:15.423610   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:15.423918   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:15.923744   49088 type.go:168] "Request Body" body=""
	I1202 19:11:15.923820   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:15.924119   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:16.423897   49088 type.go:168] "Request Body" body=""
	I1202 19:11:16.423968   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:16.424281   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:11:16.424354   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:11:16.922924   49088 type.go:168] "Request Body" body=""
	I1202 19:11:16.923006   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:16.923265   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:17.422970   49088 type.go:168] "Request Body" body=""
	I1202 19:11:17.423042   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:17.423367   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:17.922982   49088 type.go:168] "Request Body" body=""
	I1202 19:11:17.923056   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:17.923356   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:18.422877   49088 type.go:168] "Request Body" body=""
	I1202 19:11:18.422949   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:18.423201   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:18.922936   49088 type.go:168] "Request Body" body=""
	I1202 19:11:18.923013   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:18.923346   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:11:18.923404   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:11:19.422980   49088 type.go:168] "Request Body" body=""
	I1202 19:11:19.423053   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:19.423374   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:19.922930   49088 type.go:168] "Request Body" body=""
	I1202 19:11:19.923009   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:19.923288   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:20.422975   49088 type.go:168] "Request Body" body=""
	I1202 19:11:20.423047   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:20.423341   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:20.923042   49088 type.go:168] "Request Body" body=""
	I1202 19:11:20.923120   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:20.923474   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:11:20.923528   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:11:21.423170   49088 type.go:168] "Request Body" body=""
	I1202 19:11:21.423246   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:21.423533   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:21.922987   49088 type.go:168] "Request Body" body=""
	I1202 19:11:21.923069   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:21.923428   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:22.423136   49088 type.go:168] "Request Body" body=""
	I1202 19:11:22.423218   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:22.423580   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:22.923816   49088 type.go:168] "Request Body" body=""
	I1202 19:11:22.923890   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:22.924148   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:11:22.924186   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:11:23.422866   49088 type.go:168] "Request Body" body=""
	I1202 19:11:23.422936   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:23.423286   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:23.923262   49088 type.go:168] "Request Body" body=""
	I1202 19:11:23.923352   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:23.923994   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:24.422942   49088 type.go:168] "Request Body" body=""
	I1202 19:11:24.423029   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:24.423374   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:24.922937   49088 type.go:168] "Request Body" body=""
	I1202 19:11:24.923030   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:24.923429   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:25.422933   49088 type.go:168] "Request Body" body=""
	I1202 19:11:25.423008   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:25.423357   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:11:25.423414   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:11:25.922926   49088 type.go:168] "Request Body" body=""
	I1202 19:11:25.922991   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:25.923253   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:26.422987   49088 type.go:168] "Request Body" body=""
	I1202 19:11:26.423057   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:26.423401   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:26.923094   49088 type.go:168] "Request Body" body=""
	I1202 19:11:26.923165   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:26.923469   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:27.422920   49088 type.go:168] "Request Body" body=""
	I1202 19:11:27.422991   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:27.423278   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:27.922888   49088 type.go:168] "Request Body" body=""
	I1202 19:11:27.922961   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:27.923314   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:11:27.923368   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:11:28.422912   49088 type.go:168] "Request Body" body=""
	I1202 19:11:28.422990   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:28.423375   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:28.923772   49088 type.go:168] "Request Body" body=""
	I1202 19:11:28.923838   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:28.924083   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:29.423850   49088 type.go:168] "Request Body" body=""
	I1202 19:11:29.423927   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:29.424260   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:29.923896   49088 type.go:168] "Request Body" body=""
	I1202 19:11:29.923968   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:29.924284   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:11:29.924358   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:11:30.423879   49088 type.go:168] "Request Body" body=""
	I1202 19:11:30.423953   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:30.424220   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:30.922930   49088 type.go:168] "Request Body" body=""
	I1202 19:11:30.923004   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:30.923350   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:31.423073   49088 type.go:168] "Request Body" body=""
	I1202 19:11:31.423147   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:31.423507   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:31.922953   49088 type.go:168] "Request Body" body=""
	I1202 19:11:31.923031   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:31.923337   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:32.422971   49088 type.go:168] "Request Body" body=""
	I1202 19:11:32.423046   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:32.423366   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:11:32.423417   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:11:32.923034   49088 type.go:168] "Request Body" body=""
	I1202 19:11:32.923109   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:32.923438   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:33.423135   49088 type.go:168] "Request Body" body=""
	I1202 19:11:33.423204   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:33.423464   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:33.922974   49088 type.go:168] "Request Body" body=""
	I1202 19:11:33.923049   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:33.923387   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:34.423091   49088 type.go:168] "Request Body" body=""
	I1202 19:11:34.423168   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:34.423523   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:11:34.423580   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:11:34.923239   49088 type.go:168] "Request Body" body=""
	I1202 19:11:34.923307   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:34.923573   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:35.423667   49088 type.go:168] "Request Body" body=""
	I1202 19:11:35.423743   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:35.424088   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:35.923861   49088 type.go:168] "Request Body" body=""
	I1202 19:11:35.923938   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:35.924296   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:36.422936   49088 type.go:168] "Request Body" body=""
	I1202 19:11:36.423001   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:36.423257   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:36.922952   49088 type.go:168] "Request Body" body=""
	I1202 19:11:36.923029   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:36.923375   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:11:36.923429   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:11:37.422938   49088 type.go:168] "Request Body" body=""
	I1202 19:11:37.423016   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:37.423347   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:37.923043   49088 type.go:168] "Request Body" body=""
	I1202 19:11:37.923123   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:37.923400   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:38.422941   49088 type.go:168] "Request Body" body=""
	I1202 19:11:38.423011   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:38.423321   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:38.922959   49088 type.go:168] "Request Body" body=""
	I1202 19:11:38.923034   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:38.923352   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:39.422867   49088 type.go:168] "Request Body" body=""
	I1202 19:11:39.422947   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:39.423239   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:11:39.423286   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:11:39.922967   49088 type.go:168] "Request Body" body=""
	I1202 19:11:39.923043   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:39.923373   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:40.422980   49088 type.go:168] "Request Body" body=""
	I1202 19:11:40.423062   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:40.423412   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:40.923647   49088 type.go:168] "Request Body" body=""
	I1202 19:11:40.923717   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:40.923978   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:41.423742   49088 type.go:168] "Request Body" body=""
	I1202 19:11:41.423821   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:41.424168   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:11:41.424221   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:11:41.922872   49088 type.go:168] "Request Body" body=""
	I1202 19:11:41.922945   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:41.923310   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:42.423001   49088 type.go:168] "Request Body" body=""
	I1202 19:11:42.423082   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:42.423364   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:42.922973   49088 type.go:168] "Request Body" body=""
	I1202 19:11:42.923054   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:42.923395   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:43.422979   49088 type.go:168] "Request Body" body=""
	I1202 19:11:43.423052   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:43.423368   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:43.922915   49088 type.go:168] "Request Body" body=""
	I1202 19:11:43.922986   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:43.923252   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:11:43.923292   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:11:44.422977   49088 type.go:168] "Request Body" body=""
	I1202 19:11:44.423058   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:44.423399   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:44.923181   49088 type.go:168] "Request Body" body=""
	I1202 19:11:44.923261   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:44.923585   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:45.423566   49088 type.go:168] "Request Body" body=""
	I1202 19:11:45.423644   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:45.423912   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:45.923624   49088 type.go:168] "Request Body" body=""
	I1202 19:11:45.923703   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:45.924023   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:11:45.924071   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:11:46.423663   49088 type.go:168] "Request Body" body=""
	I1202 19:11:46.423734   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:46.424056   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:46.923091   49088 type.go:168] "Request Body" body=""
	I1202 19:11:46.923157   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:46.923456   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:47.423153   49088 type.go:168] "Request Body" body=""
	I1202 19:11:47.423230   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:47.423569   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:47.923278   49088 type.go:168] "Request Body" body=""
	I1202 19:11:47.923362   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:47.923689   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:48.422896   49088 type.go:168] "Request Body" body=""
	I1202 19:11:48.422973   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:48.423231   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:11:48.423281   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:11:48.922955   49088 type.go:168] "Request Body" body=""
	I1202 19:11:48.923028   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:48.923392   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:49.423094   49088 type.go:168] "Request Body" body=""
	I1202 19:11:49.423172   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:49.423529   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:49.923206   49088 type.go:168] "Request Body" body=""
	I1202 19:11:49.923275   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:49.923532   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:50.423633   49088 type.go:168] "Request Body" body=""
	I1202 19:11:50.423711   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:50.424022   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:11:50.424076   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:11:50.923805   49088 type.go:168] "Request Body" body=""
	I1202 19:11:50.923878   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:50.924219   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:51.423760   49088 type.go:168] "Request Body" body=""
	I1202 19:11:51.423833   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:51.424113   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:51.923870   49088 type.go:168] "Request Body" body=""
	I1202 19:11:51.923950   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:51.924286   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:52.422868   49088 type.go:168] "Request Body" body=""
	I1202 19:11:52.422952   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:52.423285   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:52.923892   49088 type.go:168] "Request Body" body=""
	I1202 19:11:52.923962   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:52.924248   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:11:52.924291   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:11:53.422923   49088 type.go:168] "Request Body" body=""
	I1202 19:11:53.423002   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:53.423357   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:53.922965   49088 type.go:168] "Request Body" body=""
	I1202 19:11:53.923041   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:53.923384   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:54.422914   49088 type.go:168] "Request Body" body=""
	I1202 19:11:54.422991   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:54.423296   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:54.922927   49088 type.go:168] "Request Body" body=""
	I1202 19:11:54.923011   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:54.923324   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:55.423007   49088 type.go:168] "Request Body" body=""
	I1202 19:11:55.423084   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:55.423406   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:11:55.423459   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:11:55.922966   49088 type.go:168] "Request Body" body=""
	I1202 19:11:55.923046   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:55.923318   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:56.423006   49088 type.go:168] "Request Body" body=""
	I1202 19:11:56.423078   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:56.423413   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:56.923138   49088 type.go:168] "Request Body" body=""
	I1202 19:11:56.923215   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:56.923566   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:57.423251   49088 type.go:168] "Request Body" body=""
	I1202 19:11:57.423331   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:57.423624   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:11:57.423682   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:11:57.923555   49088 type.go:168] "Request Body" body=""
	I1202 19:11:57.923629   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:57.923962   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:58.423744   49088 type.go:168] "Request Body" body=""
	I1202 19:11:58.423824   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:58.424157   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:58.923666   49088 type.go:168] "Request Body" body=""
	I1202 19:11:58.923737   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:58.924002   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:59.423779   49088 type.go:168] "Request Body" body=""
	I1202 19:11:59.423856   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:59.424204   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:11:59.424262   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:11:59.922940   49088 type.go:168] "Request Body" body=""
	I1202 19:11:59.923028   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:59.923350   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:00.422961   49088 type.go:168] "Request Body" body=""
	I1202 19:12:00.423042   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:00.423415   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:00.923010   49088 type.go:168] "Request Body" body=""
	I1202 19:12:00.923091   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:00.923432   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:01.423015   49088 type.go:168] "Request Body" body=""
	I1202 19:12:01.423098   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:01.423440   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:01.923901   49088 type.go:168] "Request Body" body=""
	I1202 19:12:01.923978   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:01.924301   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:12:01.924373   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:12:02.423047   49088 type.go:168] "Request Body" body=""
	I1202 19:12:02.423120   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:02.423479   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:02.923197   49088 type.go:168] "Request Body" body=""
	I1202 19:12:02.923275   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:02.923599   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:03.422914   49088 type.go:168] "Request Body" body=""
	I1202 19:12:03.422989   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:03.423273   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:03.922947   49088 type.go:168] "Request Body" body=""
	I1202 19:12:03.923023   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:03.923352   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:04.422979   49088 type.go:168] "Request Body" body=""
	I1202 19:12:04.423057   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:04.423399   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:12:04.423455   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:12:04.923130   49088 type.go:168] "Request Body" body=""
	I1202 19:12:04.923196   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:04.923453   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:05.423379   49088 type.go:168] "Request Body" body=""
	I1202 19:12:05.423457   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:05.423797   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:05.923568   49088 type.go:168] "Request Body" body=""
	I1202 19:12:05.923643   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:05.923966   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:06.423488   49088 type.go:168] "Request Body" body=""
	I1202 19:12:06.423556   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:06.423829   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:12:06.423875   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:12:06.923674   49088 type.go:168] "Request Body" body=""
	I1202 19:12:06.923754   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:06.924076   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:07.423881   49088 type.go:168] "Request Body" body=""
	I1202 19:12:07.423954   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:07.424301   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:07.923933   49088 type.go:168] "Request Body" body=""
	I1202 19:12:07.924013   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:07.924280   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:08.422981   49088 type.go:168] "Request Body" body=""
	I1202 19:12:08.423059   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:08.423411   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:08.923117   49088 type.go:168] "Request Body" body=""
	I1202 19:12:08.923190   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:08.923511   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:12:08.923573   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:12:09.422927   49088 type.go:168] "Request Body" body=""
	I1202 19:12:09.422996   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:09.423311   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:09.923024   49088 type.go:168] "Request Body" body=""
	I1202 19:12:09.923100   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:09.923439   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:10.422995   49088 type.go:168] "Request Body" body=""
	I1202 19:12:10.423070   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:10.423406   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:10.922922   49088 type.go:168] "Request Body" body=""
	I1202 19:12:10.922997   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:10.923336   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:11.422925   49088 type.go:168] "Request Body" body=""
	I1202 19:12:11.423002   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:11.423341   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:12:11.423398   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:12:11.923073   49088 type.go:168] "Request Body" body=""
	I1202 19:12:11.923147   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:11.923465   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:12.422920   49088 type.go:168] "Request Body" body=""
	I1202 19:12:12.422996   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:12.423266   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:12.922985   49088 type.go:168] "Request Body" body=""
	I1202 19:12:12.923057   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:12.923408   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:13.422982   49088 type.go:168] "Request Body" body=""
	I1202 19:12:13.423065   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:13.423401   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:12:13.423447   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:12:13.923740   49088 type.go:168] "Request Body" body=""
	I1202 19:12:13.923807   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:13.924068   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:14.423836   49088 type.go:168] "Request Body" body=""
	I1202 19:12:14.423917   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:14.424266   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:14.922896   49088 type.go:168] "Request Body" body=""
	I1202 19:12:14.922969   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:14.923318   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:15.423109   49088 type.go:168] "Request Body" body=""
	I1202 19:12:15.423181   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:15.423435   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:12:15.423486   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:12:15.922969   49088 type.go:168] "Request Body" body=""
	I1202 19:12:15.923045   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:15.923399   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:16.422983   49088 type.go:168] "Request Body" body=""
	I1202 19:12:16.423056   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:16.423395   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:16.922950   49088 type.go:168] "Request Body" body=""
	I1202 19:12:16.923033   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:16.923357   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:17.422981   49088 type.go:168] "Request Body" body=""
	I1202 19:12:17.423053   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:17.423403   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:17.923097   49088 type.go:168] "Request Body" body=""
	I1202 19:12:17.923175   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:17.923503   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:12:17.923560   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:12:18.423204   49088 type.go:168] "Request Body" body=""
	I1202 19:12:18.423269   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:18.423531   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:18.922961   49088 type.go:168] "Request Body" body=""
	I1202 19:12:18.923042   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:18.923383   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:19.422976   49088 type.go:168] "Request Body" body=""
	I1202 19:12:19.423057   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:19.423400   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:19.923083   49088 type.go:168] "Request Body" body=""
	I1202 19:12:19.923154   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:19.923406   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:20.423485   49088 type.go:168] "Request Body" body=""
	I1202 19:12:20.423567   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:20.423913   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:12:20.423967   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:12:20.923722   49088 type.go:168] "Request Body" body=""
	I1202 19:12:20.923795   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:20.924168   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:21.422933   49088 type.go:168] "Request Body" body=""
	I1202 19:12:21.422998   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:21.423294   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:21.922979   49088 type.go:168] "Request Body" body=""
	I1202 19:12:21.923058   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:21.923391   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:22.422989   49088 type.go:168] "Request Body" body=""
	I1202 19:12:22.423061   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:22.423414   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:22.923563   49088 type.go:168] "Request Body" body=""
	I1202 19:12:22.923637   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:22.923893   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:12:22.923932   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:12:23.423651   49088 type.go:168] "Request Body" body=""
	I1202 19:12:23.423730   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:23.424077   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:23.923725   49088 type.go:168] "Request Body" body=""
	I1202 19:12:23.923795   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:23.924130   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:24.423644   49088 type.go:168] "Request Body" body=""
	I1202 19:12:24.423724   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:24.424004   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:24.923060   49088 type.go:168] "Request Body" body=""
	I1202 19:12:24.923142   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:24.923487   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:25.423281   49088 type.go:168] "Request Body" body=""
	I1202 19:12:25.423364   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:25.423721   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:12:25.423779   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:12:25.923482   49088 type.go:168] "Request Body" body=""
	I1202 19:12:25.923550   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:25.923808   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:26.423542   49088 type.go:168] "Request Body" body=""
	I1202 19:12:26.423613   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:26.423935   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:26.923724   49088 type.go:168] "Request Body" body=""
	I1202 19:12:26.923807   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:26.924187   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:27.422900   49088 type.go:168] "Request Body" body=""
	I1202 19:12:27.422973   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:27.423252   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:27.922940   49088 type.go:168] "Request Body" body=""
	I1202 19:12:27.923019   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:27.923343   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:12:27.923403   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:12:28.422986   49088 type.go:168] "Request Body" body=""
	I1202 19:12:28.423061   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:28.423432   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:28.922921   49088 type.go:168] "Request Body" body=""
	I1202 19:12:28.923034   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:28.923349   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:29.422983   49088 type.go:168] "Request Body" body=""
	I1202 19:12:29.423092   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:29.423421   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:29.923070   49088 type.go:168] "Request Body" body=""
	I1202 19:12:29.923147   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:29.923486   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:12:29.923558   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:12:30.423578   49088 type.go:168] "Request Body" body=""
	I1202 19:12:30.423659   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:30.423950   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:30.923208   49088 type.go:168] "Request Body" body=""
	I1202 19:12:30.923281   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:30.923639   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:31.422982   49088 type.go:168] "Request Body" body=""
	I1202 19:12:31.423059   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:31.423398   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:31.922934   49088 type.go:168] "Request Body" body=""
	I1202 19:12:31.923000   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:31.923257   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:32.422953   49088 type.go:168] "Request Body" body=""
	I1202 19:12:32.423028   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:32.423354   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:12:32.423413   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:12:32.923085   49088 type.go:168] "Request Body" body=""
	I1202 19:12:32.923168   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:32.923564   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:33.423262   49088 type.go:168] "Request Body" body=""
	I1202 19:12:33.423340   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:33.423598   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:33.922973   49088 type.go:168] "Request Body" body=""
	I1202 19:12:33.923049   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:33.923389   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:34.422949   49088 type.go:168] "Request Body" body=""
	I1202 19:12:34.423022   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:34.423336   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:34.922878   49088 type.go:168] "Request Body" body=""
	I1202 19:12:34.922943   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:34.923200   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:12:34.923239   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:12:35.423165   49088 type.go:168] "Request Body" body=""
	I1202 19:12:35.423238   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:35.423568   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:35.923256   49088 type.go:168] "Request Body" body=""
	I1202 19:12:35.923329   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:35.923637   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:36.422898   49088 type.go:168] "Request Body" body=""
	I1202 19:12:36.422965   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:36.423218   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:36.922956   49088 type.go:168] "Request Body" body=""
	I1202 19:12:36.923035   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:36.923355   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:12:36.923409   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:12:37.423086   49088 type.go:168] "Request Body" body=""
	I1202 19:12:37.423161   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:37.423449   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:37.923834   49088 type.go:168] "Request Body" body=""
	I1202 19:12:37.923903   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:37.924159   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:38.422906   49088 type.go:168] "Request Body" body=""
	I1202 19:12:38.422977   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:38.423261   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:38.922970   49088 type.go:168] "Request Body" body=""
	I1202 19:12:38.923046   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:38.923389   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:12:38.923440   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:12:39.423714   49088 type.go:168] "Request Body" body=""
	I1202 19:12:39.423788   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:39.424096   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:39.923879   49088 type.go:168] "Request Body" body=""
	I1202 19:12:39.923950   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:39.924267   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:40.423003   49088 type.go:168] "Request Body" body=""
	I1202 19:12:40.423081   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:40.423385   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:40.923058   49088 type.go:168] "Request Body" body=""
	I1202 19:12:40.923129   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:40.923426   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:12:40.923486   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:12:41.422953   49088 type.go:168] "Request Body" body=""
	I1202 19:12:41.423050   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:41.423343   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:41.923078   49088 type.go:168] "Request Body" body=""
	I1202 19:12:41.923156   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:41.923473   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:42.423139   49088 type.go:168] "Request Body" body=""
	I1202 19:12:42.423204   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:42.423502   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:42.923011   49088 type.go:168] "Request Body" body=""
	I1202 19:12:42.923090   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:42.923406   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:43.423000   49088 type.go:168] "Request Body" body=""
	I1202 19:12:43.423079   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:43.423400   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:12:43.423446   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:12:43.923623   49088 type.go:168] "Request Body" body=""
	I1202 19:12:43.923696   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:43.923958   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:44.423707   49088 type.go:168] "Request Body" body=""
	I1202 19:12:44.423805   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:44.424192   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:44.923041   49088 type.go:168] "Request Body" body=""
	I1202 19:12:44.923114   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:44.923460   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:45.423487   49088 type.go:168] "Request Body" body=""
	I1202 19:12:45.423555   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:45.423816   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:12:45.423856   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:12:45.923602   49088 type.go:168] "Request Body" body=""
	I1202 19:12:45.923678   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:45.924003   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:46.423772   49088 type.go:168] "Request Body" body=""
	I1202 19:12:46.423843   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:46.424193   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:46.923702   49088 type.go:168] "Request Body" body=""
	I1202 19:12:46.923775   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:46.924028   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:47.423742   49088 type.go:168] "Request Body" body=""
	I1202 19:12:47.423811   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:47.424126   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:12:47.424184   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:12:47.923682   49088 type.go:168] "Request Body" body=""
	I1202 19:12:47.923759   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:47.924070   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:48.423650   49088 type.go:168] "Request Body" body=""
	I1202 19:12:48.423719   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:48.423981   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:48.923704   49088 type.go:168] "Request Body" body=""
	I1202 19:12:48.923774   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:48.924090   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:49.423900   49088 type.go:168] "Request Body" body=""
	I1202 19:12:49.423983   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:49.424354   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:12:49.424408   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:12:49.922950   49088 type.go:168] "Request Body" body=""
	I1202 19:12:49.923023   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:49.923365   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:50.423243   49088 type.go:168] "Request Body" body=""
	I1202 19:12:50.423322   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:50.423653   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:50.922964   49088 type.go:168] "Request Body" body=""
	I1202 19:12:50.923042   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:50.923377   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:51.422954   49088 type.go:168] "Request Body" body=""
	I1202 19:12:51.423023   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:51.423346   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:51.923047   49088 type.go:168] "Request Body" body=""
	I1202 19:12:51.923126   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:51.923460   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:12:51.923513   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:12:52.422937   49088 type.go:168] "Request Body" body=""
	I1202 19:12:52.423014   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:52.423303   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:52.922929   49088 type.go:168] "Request Body" body=""
	I1202 19:12:52.923016   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:52.923375   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:53.423008   49088 type.go:168] "Request Body" body=""
	I1202 19:12:53.423100   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:53.423482   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:53.922987   49088 type.go:168] "Request Body" body=""
	I1202 19:12:53.923061   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:53.923413   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:54.422909   49088 type.go:168] "Request Body" body=""
	I1202 19:12:54.422983   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:54.423246   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:12:54.423296   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:12:54.923072   49088 type.go:168] "Request Body" body=""
	I1202 19:12:54.923153   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:54.923523   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:55.423518   49088 type.go:168] "Request Body" body=""
	I1202 19:12:55.423603   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:55.423968   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:55.923602   49088 type.go:168] "Request Body" body=""
	I1202 19:12:55.923672   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:55.924000   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:56.423806   49088 type.go:168] "Request Body" body=""
	I1202 19:12:56.423881   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:56.424245   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:12:56.424298   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:12:56.923893   49088 type.go:168] "Request Body" body=""
	I1202 19:12:56.923967   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:56.924355   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:57.422930   49088 type.go:168] "Request Body" body=""
	I1202 19:12:57.422997   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:57.423287   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:57.922958   49088 type.go:168] "Request Body" body=""
	I1202 19:12:57.923030   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:57.923341   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:58.423065   49088 type.go:168] "Request Body" body=""
	I1202 19:12:58.423137   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:58.423498   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:58.923185   49088 type.go:168] "Request Body" body=""
	I1202 19:12:58.923255   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:58.923518   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:12:58.923557   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:12:59.422988   49088 type.go:168] "Request Body" body=""
	I1202 19:12:59.423062   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:59.423415   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:59.923116   49088 type.go:168] "Request Body" body=""
	I1202 19:12:59.923196   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:59.923537   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:00.422981   49088 type.go:168] "Request Body" body=""
	I1202 19:13:00.423055   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:00.423421   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:00.922957   49088 type.go:168] "Request Body" body=""
	I1202 19:13:00.923031   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:00.923358   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:01.423025   49088 type.go:168] "Request Body" body=""
	I1202 19:13:01.423098   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:01.423429   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:13:01.423485   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:13:01.923135   49088 type.go:168] "Request Body" body=""
	I1202 19:13:01.923210   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:01.923470   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:02.422970   49088 type.go:168] "Request Body" body=""
	I1202 19:13:02.423066   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:02.423386   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:02.923090   49088 type.go:168] "Request Body" body=""
	I1202 19:13:02.923194   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:02.923514   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:03.423202   49088 type.go:168] "Request Body" body=""
	I1202 19:13:03.423271   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:03.423580   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:13:03.423632   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:13:03.922949   49088 type.go:168] "Request Body" body=""
	I1202 19:13:03.923051   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:03.923344   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:04.422975   49088 type.go:168] "Request Body" body=""
	I1202 19:13:04.423049   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:04.423409   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:04.923130   49088 type.go:168] "Request Body" body=""
	I1202 19:13:04.923198   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:04.923485   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:05.423469   49088 type.go:168] "Request Body" body=""
	I1202 19:13:05.423540   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:05.423887   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:13:05.423943   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:13:05.923727   49088 type.go:168] "Request Body" body=""
	I1202 19:13:05.923801   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:05.924115   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:06.423862   49088 type.go:168] "Request Body" body=""
	I1202 19:13:06.423938   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:06.424199   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:06.922912   49088 type.go:168] "Request Body" body=""
	I1202 19:13:06.922984   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:06.923325   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:07.422978   49088 type.go:168] "Request Body" body=""
	I1202 19:13:07.423053   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:07.423373   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:07.922921   49088 type.go:168] "Request Body" body=""
	I1202 19:13:07.922995   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:07.923258   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:13:07.923298   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:13:08.422983   49088 type.go:168] "Request Body" body=""
	I1202 19:13:08.423057   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:08.423391   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:08.922933   49088 type.go:168] "Request Body" body=""
	I1202 19:13:08.923006   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:08.923329   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:09.422864   49088 type.go:168] "Request Body" body=""
	I1202 19:13:09.422931   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:09.423213   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:09.922936   49088 type.go:168] "Request Body" body=""
	I1202 19:13:09.923016   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:09.923330   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:13:09.923387   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:13:10.422968   49088 type.go:168] "Request Body" body=""
	I1202 19:13:10.423047   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:10.423382   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:10.922900   49088 type.go:168] "Request Body" body=""
	I1202 19:13:10.922972   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:10.923227   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:11.422967   49088 type.go:168] "Request Body" body=""
	I1202 19:13:11.423047   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:11.423372   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:11.922966   49088 type.go:168] "Request Body" body=""
	I1202 19:13:11.923038   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:11.923347   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:12.422911   49088 type.go:168] "Request Body" body=""
	I1202 19:13:12.422981   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:12.423291   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:13:12.423344   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:13:12.922990   49088 type.go:168] "Request Body" body=""
	I1202 19:13:12.923081   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:12.923443   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:13.423156   49088 type.go:168] "Request Body" body=""
	I1202 19:13:13.423234   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:13.423549   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:13.922900   49088 type.go:168] "Request Body" body=""
	I1202 19:13:13.922969   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:13.923235   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:14.422957   49088 type.go:168] "Request Body" body=""
	I1202 19:13:14.423029   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:14.423396   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:13:14.423449   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:13:14.923039   49088 type.go:168] "Request Body" body=""
	I1202 19:13:14.923111   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:14.923397   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:15.423223   49088 type.go:168] "Request Body" body=""
	I1202 19:13:15.423302   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:15.423557   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:15.922960   49088 type.go:168] "Request Body" body=""
	I1202 19:13:15.923034   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:15.923367   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:16.423062   49088 type.go:168] "Request Body" body=""
	I1202 19:13:16.423140   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:16.423473   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:13:16.423529   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:13:16.923839   49088 type.go:168] "Request Body" body=""
	I1202 19:13:16.923917   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:16.924188   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:17.422894   49088 type.go:168] "Request Body" body=""
	I1202 19:13:17.422973   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:17.423308   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:17.922909   49088 type.go:168] "Request Body" body=""
	I1202 19:13:17.922985   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:17.923348   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:18.423042   49088 type.go:168] "Request Body" body=""
	I1202 19:13:18.423112   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:18.423378   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:18.923054   49088 type.go:168] "Request Body" body=""
	I1202 19:13:18.923129   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:18.923451   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:13:18.923507   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:13:19.422987   49088 type.go:168] "Request Body" body=""
	I1202 19:13:19.423063   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:19.423405   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:19.923563   49088 type.go:168] "Request Body" body=""
	I1202 19:13:19.923634   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:19.923942   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:20.423766   49088 type.go:168] "Request Body" body=""
	I1202 19:13:20.423836   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:20.424183   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:20.922889   49088 type.go:168] "Request Body" body=""
	I1202 19:13:20.922963   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:20.923286   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:21.422910   49088 type.go:168] "Request Body" body=""
	I1202 19:13:21.422986   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:21.423265   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:13:21.423304   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:13:21.922960   49088 type.go:168] "Request Body" body=""
	I1202 19:13:21.923032   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:21.923372   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:22.422982   49088 type.go:168] "Request Body" body=""
	I1202 19:13:22.423066   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:22.423408   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:22.923660   49088 type.go:168] "Request Body" body=""
	I1202 19:13:22.923726   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:22.923989   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:23.423852   49088 type.go:168] "Request Body" body=""
	I1202 19:13:23.423930   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:23.424355   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:13:23.424410   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:13:23.922972   49088 type.go:168] "Request Body" body=""
	I1202 19:13:23.923088   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:23.923457   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:24.423150   49088 type.go:168] "Request Body" body=""
	I1202 19:13:24.423233   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:24.423566   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:24.923507   49088 type.go:168] "Request Body" body=""
	I1202 19:13:24.923591   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:24.923948   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:25.423717   49088 type.go:168] "Request Body" body=""
	I1202 19:13:25.423797   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:25.424161   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:25.922841   49088 node_ready.go:38] duration metric: took 6m0.000085627s for node "functional-449836" to be "Ready" ...
	I1202 19:13:25.925875   49088 out.go:203] 
	W1202 19:13:25.928738   49088 out.go:285] X Exiting due to GUEST_START: failed to start node: wait 6m0s for node: waiting for node to be ready: WaitNodeCondition: context deadline exceeded
	X Exiting due to GUEST_START: failed to start node: wait 6m0s for node: waiting for node to be ready: WaitNodeCondition: context deadline exceeded
	W1202 19:13:25.928760   49088 out.go:285] * 
	* 
	W1202 19:13:25.930899   49088 out.go:308] ╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	I1202 19:13:25.934748   49088 out.go:203] 

** /stderr **
functional_test.go:676: failed to soft start minikube. args "out/minikube-linux-arm64 start -p functional-449836 --alsologtostderr -v=8": exit status 80
functional_test.go:678: soft start took 6m6.63375998s for "functional-449836" cluster.
I1202 19:13:26.516484    4435 config.go:182] Loaded profile config "functional-449836": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-beta.0
helpers_test.go:222: -----------------------post-mortem--------------------------------
helpers_test.go:223: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/SoftStart]: network settings <======
helpers_test.go:230: HOST ENV snapshots: PROXY env: HTTP_PROXY="<empty>" HTTPS_PROXY="<empty>" NO_PROXY="<empty>"
helpers_test.go:238: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/SoftStart]: docker inspect <======
helpers_test.go:239: (dbg) Run:  docker inspect functional-449836
helpers_test.go:243: (dbg) docker inspect functional-449836:

-- stdout --
	[
	    {
	        "Id": "6870f21b62bb6903aca3129f1ce4723cf3f2ffad99a50b164f8e2dc04b50e75d",
	        "Created": "2025-12-02T18:58:53.515075222Z",
	        "Path": "/usr/local/bin/entrypoint",
	        "Args": [
	            "/sbin/init"
	        ],
	        "State": {
	            "Status": "running",
	            "Running": true,
	            "Paused": false,
	            "Restarting": false,
	            "OOMKilled": false,
	            "Dead": false,
	            "Pid": 43093,
	            "ExitCode": 0,
	            "Error": "",
	            "StartedAt": "2025-12-02T18:58:53.587847975Z",
	            "FinishedAt": "0001-01-01T00:00:00Z"
	        },
	        "Image": "sha256:ac919894123858c63a6b115b7a0677e38aafc32ba4f00c3ebbd7c61e958451be",
	        "ResolvConfPath": "/var/lib/docker/containers/6870f21b62bb6903aca3129f1ce4723cf3f2ffad99a50b164f8e2dc04b50e75d/resolv.conf",
	        "HostnamePath": "/var/lib/docker/containers/6870f21b62bb6903aca3129f1ce4723cf3f2ffad99a50b164f8e2dc04b50e75d/hostname",
	        "HostsPath": "/var/lib/docker/containers/6870f21b62bb6903aca3129f1ce4723cf3f2ffad99a50b164f8e2dc04b50e75d/hosts",
	        "LogPath": "/var/lib/docker/containers/6870f21b62bb6903aca3129f1ce4723cf3f2ffad99a50b164f8e2dc04b50e75d/6870f21b62bb6903aca3129f1ce4723cf3f2ffad99a50b164f8e2dc04b50e75d-json.log",
	        "Name": "/functional-449836",
	        "RestartCount": 0,
	        "Driver": "overlay2",
	        "Platform": "linux",
	        "MountLabel": "",
	        "ProcessLabel": "",
	        "AppArmorProfile": "unconfined",
	        "ExecIDs": null,
	        "HostConfig": {
	            "Binds": [
	                "/lib/modules:/lib/modules:ro",
	                "functional-449836:/var"
	            ],
	            "ContainerIDFile": "",
	            "LogConfig": {
	                "Type": "json-file",
	                "Config": {}
	            },
	            "NetworkMode": "functional-449836",
	            "PortBindings": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "8441/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ]
	            },
	            "RestartPolicy": {
	                "Name": "no",
	                "MaximumRetryCount": 0
	            },
	            "AutoRemove": false,
	            "VolumeDriver": "",
	            "VolumesFrom": null,
	            "ConsoleSize": [
	                0,
	                0
	            ],
	            "CapAdd": null,
	            "CapDrop": null,
	            "CgroupnsMode": "host",
	            "Dns": [],
	            "DnsOptions": [],
	            "DnsSearch": [],
	            "ExtraHosts": null,
	            "GroupAdd": null,
	            "IpcMode": "private",
	            "Cgroup": "",
	            "Links": null,
	            "OomScoreAdj": 0,
	            "PidMode": "",
	            "Privileged": true,
	            "PublishAllPorts": false,
	            "ReadonlyRootfs": false,
	            "SecurityOpt": [
	                "seccomp=unconfined",
	                "apparmor=unconfined",
	                "label=disable"
	            ],
	            "Tmpfs": {
	                "/run": "",
	                "/tmp": ""
	            },
	            "UTSMode": "",
	            "UsernsMode": "",
	            "ShmSize": 67108864,
	            "Runtime": "runc",
	            "Isolation": "",
	            "CpuShares": 0,
	            "Memory": 4294967296,
	            "NanoCpus": 2000000000,
	            "CgroupParent": "",
	            "BlkioWeight": 0,
	            "BlkioWeightDevice": [],
	            "BlkioDeviceReadBps": [],
	            "BlkioDeviceWriteBps": [],
	            "BlkioDeviceReadIOps": [],
	            "BlkioDeviceWriteIOps": [],
	            "CpuPeriod": 0,
	            "CpuQuota": 0,
	            "CpuRealtimePeriod": 0,
	            "CpuRealtimeRuntime": 0,
	            "CpusetCpus": "",
	            "CpusetMems": "",
	            "Devices": [],
	            "DeviceCgroupRules": null,
	            "DeviceRequests": null,
	            "MemoryReservation": 0,
	            "MemorySwap": 8589934592,
	            "MemorySwappiness": null,
	            "OomKillDisable": false,
	            "PidsLimit": null,
	            "Ulimits": [],
	            "CpuCount": 0,
	            "CpuPercent": 0,
	            "IOMaximumIOps": 0,
	            "IOMaximumBandwidth": 0,
	            "MaskedPaths": null,
	            "ReadonlyPaths": null
	        },
	        "GraphDriver": {
	            "Data": {
	                "ID": "6870f21b62bb6903aca3129f1ce4723cf3f2ffad99a50b164f8e2dc04b50e75d",
	                "LowerDir": "/var/lib/docker/overlay2/41ea5a18fbf8709879a7fc4066a6a4a1474aa86e898ee1ccabe5669a1871131d-init/diff:/var/lib/docker/overlay2/a59c61675ee48e07a7f4a8725bd393449453344ad8907963779ea1c0059d936c/diff",
	                "MergedDir": "/var/lib/docker/overlay2/41ea5a18fbf8709879a7fc4066a6a4a1474aa86e898ee1ccabe5669a1871131d/merged",
	                "UpperDir": "/var/lib/docker/overlay2/41ea5a18fbf8709879a7fc4066a6a4a1474aa86e898ee1ccabe5669a1871131d/diff",
	                "WorkDir": "/var/lib/docker/overlay2/41ea5a18fbf8709879a7fc4066a6a4a1474aa86e898ee1ccabe5669a1871131d/work"
	            },
	            "Name": "overlay2"
	        },
	        "Mounts": [
	            {
	                "Type": "bind",
	                "Source": "/lib/modules",
	                "Destination": "/lib/modules",
	                "Mode": "ro",
	                "RW": false,
	                "Propagation": "rprivate"
	            },
	            {
	                "Type": "volume",
	                "Name": "functional-449836",
	                "Source": "/var/lib/docker/volumes/functional-449836/_data",
	                "Destination": "/var",
	                "Driver": "local",
	                "Mode": "z",
	                "RW": true,
	                "Propagation": ""
	            }
	        ],
	        "Config": {
	            "Hostname": "functional-449836",
	            "Domainname": "",
	            "User": "",
	            "AttachStdin": false,
	            "AttachStdout": false,
	            "AttachStderr": false,
	            "ExposedPorts": {
	                "22/tcp": {},
	                "2376/tcp": {},
	                "32443/tcp": {},
	                "5000/tcp": {},
	                "8441/tcp": {}
	            },
	            "Tty": true,
	            "OpenStdin": false,
	            "StdinOnce": false,
	            "Env": [
	                "container=docker",
	                "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin"
	            ],
	            "Cmd": null,
	            "Image": "gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b",
	            "Volumes": null,
	            "WorkingDir": "/",
	            "Entrypoint": [
	                "/usr/local/bin/entrypoint",
	                "/sbin/init"
	            ],
	            "OnBuild": null,
	            "Labels": {
	                "created_by.minikube.sigs.k8s.io": "true",
	                "mode.minikube.sigs.k8s.io": "functional-449836",
	                "name.minikube.sigs.k8s.io": "functional-449836",
	                "role.minikube.sigs.k8s.io": ""
	            },
	            "StopSignal": "SIGRTMIN+3"
	        },
	        "NetworkSettings": {
	            "Bridge": "",
	            "SandboxID": "410fbaab809a56b556195f7ed6eeff8dcd31e9020fb1dbfacf74828b79df3d88",
	            "SandboxKey": "/var/run/docker/netns/410fbaab809a",
	            "Ports": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "32788"
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "32789"
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "32792"
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "32790"
	                    }
	                ],
	                "8441/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "32791"
	                    }
	                ]
	            },
	            "HairpinMode": false,
	            "LinkLocalIPv6Address": "",
	            "LinkLocalIPv6PrefixLen": 0,
	            "SecondaryIPAddresses": null,
	            "SecondaryIPv6Addresses": null,
	            "EndpointID": "",
	            "Gateway": "",
	            "GlobalIPv6Address": "",
	            "GlobalIPv6PrefixLen": 0,
	            "IPAddress": "",
	            "IPPrefixLen": 0,
	            "IPv6Gateway": "",
	            "MacAddress": "",
	            "Networks": {
	                "functional-449836": {
	                    "IPAMConfig": {
	                        "IPv4Address": "192.168.49.2"
	                    },
	                    "Links": null,
	                    "Aliases": null,
	                    "MacAddress": "82:18:11:ce:46:48",
	                    "DriverOpts": null,
	                    "GwPriority": 0,
	                    "NetworkID": "3cb7d67d4fa267ddd6b37211325b224fb3fb811be8ff57bda18e19f6929ec9c8",
	                    "EndpointID": "20c8c1a67e53d7615656777f73986a40cb1c6affb22c4db185c479ac85cbdb14",
	                    "Gateway": "192.168.49.1",
	                    "IPAddress": "192.168.49.2",
	                    "IPPrefixLen": 24,
	                    "IPv6Gateway": "",
	                    "GlobalIPv6Address": "",
	                    "GlobalIPv6PrefixLen": 0,
	                    "DNSNames": [
	                        "functional-449836",
	                        "6870f21b62bb"
	                    ]
	                }
	            }
	        }
	    }
	]

-- /stdout --
helpers_test.go:247: (dbg) Run:  out/minikube-linux-arm64 status --format={{.Host}} -p functional-449836 -n functional-449836
helpers_test.go:247: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.Host}} -p functional-449836 -n functional-449836: exit status 2 (426.995898ms)

-- stdout --
	Running

-- /stdout --
helpers_test.go:247: status error: exit status 2 (may be ok)
helpers_test.go:252: <<< TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/SoftStart FAILED: start of post-mortem logs <<<
helpers_test.go:253: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/SoftStart]: minikube logs <======
helpers_test.go:255: (dbg) Run:  out/minikube-linux-arm64 -p functional-449836 logs -n 25
helpers_test.go:260: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/SoftStart logs: 
-- stdout --
	
	==> Audit <==
	┌────────────────┬─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┬───────────────────┬─────────┬─────────┬─────────────────────┬─────────────────────┐
	│    COMMAND     │                                                                              ARGS                                                                               │      PROFILE      │  USER   │ VERSION │     START TIME      │      END TIME       │
	├────────────────┼─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┼───────────────────┼─────────┼─────────┼─────────────────────┼─────────────────────┤
	│ ssh            │ functional-224594 ssh sudo cat /etc/ssl/certs/44352.pem                                                                                                         │ functional-224594 │ jenkins │ v1.37.0 │ 02 Dec 25 18:58 UTC │ 02 Dec 25 18:58 UTC │
	│ image          │ functional-224594 image load --daemon kicbase/echo-server:functional-224594 --alsologtostderr                                                                   │ functional-224594 │ jenkins │ v1.37.0 │ 02 Dec 25 18:58 UTC │ 02 Dec 25 18:58 UTC │
	│ ssh            │ functional-224594 ssh sudo cat /usr/share/ca-certificates/44352.pem                                                                                             │ functional-224594 │ jenkins │ v1.37.0 │ 02 Dec 25 18:58 UTC │ 02 Dec 25 18:58 UTC │
	│ ssh            │ functional-224594 ssh sudo cat /etc/ssl/certs/3ec20f2e.0                                                                                                        │ functional-224594 │ jenkins │ v1.37.0 │ 02 Dec 25 18:58 UTC │ 02 Dec 25 18:58 UTC │
	│ image          │ functional-224594 image ls                                                                                                                                      │ functional-224594 │ jenkins │ v1.37.0 │ 02 Dec 25 18:58 UTC │ 02 Dec 25 18:58 UTC │
	│ ssh            │ functional-224594 ssh sudo cat /etc/test/nested/copy/4435/hosts                                                                                                 │ functional-224594 │ jenkins │ v1.37.0 │ 02 Dec 25 18:58 UTC │ 02 Dec 25 18:58 UTC │
	│ image          │ functional-224594 image save kicbase/echo-server:functional-224594 /home/jenkins/workspace/Docker_Linux_containerd_arm64/echo-server-save.tar --alsologtostderr │ functional-224594 │ jenkins │ v1.37.0 │ 02 Dec 25 18:58 UTC │ 02 Dec 25 18:58 UTC │
	│ image          │ functional-224594 image rm kicbase/echo-server:functional-224594 --alsologtostderr                                                                              │ functional-224594 │ jenkins │ v1.37.0 │ 02 Dec 25 18:58 UTC │ 02 Dec 25 18:58 UTC │
	│ image          │ functional-224594 image ls                                                                                                                                      │ functional-224594 │ jenkins │ v1.37.0 │ 02 Dec 25 18:58 UTC │ 02 Dec 25 18:58 UTC │
	│ image          │ functional-224594 image load /home/jenkins/workspace/Docker_Linux_containerd_arm64/echo-server-save.tar --alsologtostderr                                       │ functional-224594 │ jenkins │ v1.37.0 │ 02 Dec 25 18:58 UTC │ 02 Dec 25 18:58 UTC │
	│ image          │ functional-224594 image ls                                                                                                                                      │ functional-224594 │ jenkins │ v1.37.0 │ 02 Dec 25 18:58 UTC │ 02 Dec 25 18:58 UTC │
	│ update-context │ functional-224594 update-context --alsologtostderr -v=2                                                                                                         │ functional-224594 │ jenkins │ v1.37.0 │ 02 Dec 25 18:58 UTC │ 02 Dec 25 18:58 UTC │
	│ image          │ functional-224594 image save --daemon kicbase/echo-server:functional-224594 --alsologtostderr                                                                   │ functional-224594 │ jenkins │ v1.37.0 │ 02 Dec 25 18:58 UTC │ 02 Dec 25 18:58 UTC │
	│ update-context │ functional-224594 update-context --alsologtostderr -v=2                                                                                                         │ functional-224594 │ jenkins │ v1.37.0 │ 02 Dec 25 18:58 UTC │ 02 Dec 25 18:58 UTC │
	│ update-context │ functional-224594 update-context --alsologtostderr -v=2                                                                                                         │ functional-224594 │ jenkins │ v1.37.0 │ 02 Dec 25 18:58 UTC │ 02 Dec 25 18:58 UTC │
	│ image          │ functional-224594 image ls --format short --alsologtostderr                                                                                                     │ functional-224594 │ jenkins │ v1.37.0 │ 02 Dec 25 18:58 UTC │ 02 Dec 25 18:58 UTC │
	│ image          │ functional-224594 image ls --format yaml --alsologtostderr                                                                                                      │ functional-224594 │ jenkins │ v1.37.0 │ 02 Dec 25 18:58 UTC │ 02 Dec 25 18:58 UTC │
	│ ssh            │ functional-224594 ssh pgrep buildkitd                                                                                                                           │ functional-224594 │ jenkins │ v1.37.0 │ 02 Dec 25 18:58 UTC │                     │
	│ image          │ functional-224594 image ls --format json --alsologtostderr                                                                                                      │ functional-224594 │ jenkins │ v1.37.0 │ 02 Dec 25 18:58 UTC │ 02 Dec 25 18:58 UTC │
	│ image          │ functional-224594 image ls --format table --alsologtostderr                                                                                                     │ functional-224594 │ jenkins │ v1.37.0 │ 02 Dec 25 18:58 UTC │ 02 Dec 25 18:58 UTC │
	│ image          │ functional-224594 image build -t localhost/my-image:functional-224594 testdata/build --alsologtostderr                                                          │ functional-224594 │ jenkins │ v1.37.0 │ 02 Dec 25 18:58 UTC │ 02 Dec 25 18:58 UTC │
	│ image          │ functional-224594 image ls                                                                                                                                      │ functional-224594 │ jenkins │ v1.37.0 │ 02 Dec 25 18:58 UTC │ 02 Dec 25 18:58 UTC │
	│ delete         │ -p functional-224594                                                                                                                                            │ functional-224594 │ jenkins │ v1.37.0 │ 02 Dec 25 18:58 UTC │ 02 Dec 25 18:58 UTC │
	│ start          │ -p functional-449836 --memory=4096 --apiserver-port=8441 --wait=all --driver=docker  --container-runtime=containerd --kubernetes-version=v1.35.0-beta.0         │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 18:58 UTC │                     │
	│ start          │ -p functional-449836 --alsologtostderr -v=8                                                                                                                     │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:07 UTC │                     │
	└────────────────┴─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┴───────────────────┴─────────┴─────────┴─────────────────────┴─────────────────────┘
	
	
	==> Last Start <==
	Log file created at: 2025/12/02 19:07:19
	Running on machine: ip-172-31-31-251
	Binary: Built with gc go1.25.3 for linux/arm64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I1202 19:07:19.929855   49088 out.go:360] Setting OutFile to fd 1 ...
	I1202 19:07:19.930082   49088 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1202 19:07:19.930109   49088 out.go:374] Setting ErrFile to fd 2...
	I1202 19:07:19.930127   49088 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1202 19:07:19.930424   49088 root.go:338] Updating PATH: /home/jenkins/minikube-integration/22021-2487/.minikube/bin
	I1202 19:07:19.930829   49088 out.go:368] Setting JSON to false
	I1202 19:07:19.931678   49088 start.go:133] hostinfo: {"hostname":"ip-172-31-31-251","uptime":2976,"bootTime":1764699464,"procs":155,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1084-aws","kernelArch":"aarch64","virtualizationSystem":"","virtualizationRole":"","hostId":"982e3628-3742-4b3e-bb63-ac1b07660ec7"}
	I1202 19:07:19.931776   49088 start.go:143] virtualization:  
	I1202 19:07:19.935245   49088 out.go:179] * [functional-449836] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	I1202 19:07:19.939094   49088 out.go:179]   - MINIKUBE_LOCATION=22021
	I1202 19:07:19.939188   49088 notify.go:221] Checking for updates...
	I1202 19:07:19.944799   49088 out.go:179]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I1202 19:07:19.947646   49088 out.go:179]   - KUBECONFIG=/home/jenkins/minikube-integration/22021-2487/kubeconfig
	I1202 19:07:19.950501   49088 out.go:179]   - MINIKUBE_HOME=/home/jenkins/minikube-integration/22021-2487/.minikube
	I1202 19:07:19.953361   49088 out.go:179]   - MINIKUBE_BIN=out/minikube-linux-arm64
	I1202 19:07:19.956281   49088 out.go:179]   - MINIKUBE_FORCE_SYSTEMD=
	I1202 19:07:19.959695   49088 config.go:182] Loaded profile config "functional-449836": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-beta.0
	I1202 19:07:19.959887   49088 driver.go:422] Setting default libvirt URI to qemu:///system
	I1202 19:07:19.996438   49088 docker.go:124] docker version: linux-28.1.1:Docker Engine - Community
	I1202 19:07:19.996577   49088 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1202 19:07:20.063124   49088 info.go:266] docker info: {ID:EOU5:DNGX:XN6V:L2FZ:UXRM:5TWK:EVUR:KC2F:GT7Z:Y4O4:GB77:5PD3 Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:52 SystemTime:2025-12-02 19:07:20.053388152 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214839296 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-31-251 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1202 19:07:20.063232   49088 docker.go:319] overlay module found
	I1202 19:07:20.066390   49088 out.go:179] * Using the docker driver based on existing profile
	I1202 19:07:20.069271   49088 start.go:309] selected driver: docker
	I1202 19:07:20.069311   49088 start.go:927] validating driver "docker" against &{Name:functional-449836 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-449836 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1202 19:07:20.069422   49088 start.go:938] status for docker: {Installed:true Healthy:true Running:false NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I1202 19:07:20.069541   49088 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1202 19:07:20.132012   49088 info.go:266] docker info: {ID:EOU5:DNGX:XN6V:L2FZ:UXRM:5TWK:EVUR:KC2F:GT7Z:Y4O4:GB77:5PD3 Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:52 SystemTime:2025-12-02 19:07:20.122627931 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214839296 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-31-251 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1202 19:07:20.132615   49088 cni.go:84] Creating CNI manager for ""
	I1202 19:07:20.132692   49088 cni.go:143] "docker" driver + "containerd" runtime found, recommending kindnet
	I1202 19:07:20.132751   49088 start.go:353] cluster config:
	{Name:functional-449836 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-449836 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1202 19:07:20.135845   49088 out.go:179] * Starting "functional-449836" primary control-plane node in "functional-449836" cluster
	I1202 19:07:20.138639   49088 cache.go:134] Beginning downloading kic base image for docker with containerd
	I1202 19:07:20.141498   49088 out.go:179] * Pulling base image v0.0.48-1764169655-21974 ...
	I1202 19:07:20.144479   49088 image.go:81] Checking for gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b in local docker daemon
	I1202 19:07:20.144604   49088 preload.go:188] Checking if preload exists for k8s version v1.35.0-beta.0 and runtime containerd
	I1202 19:07:20.163347   49088 image.go:100] Found gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b in local docker daemon, skipping pull
	I1202 19:07:20.163372   49088 cache.go:158] gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b exists in daemon, skipping load
	W1202 19:07:20.218193   49088 preload.go:144] https://storage.googleapis.com/minikube-preloaded-volume-tarballs/v18/v1.35.0-beta.0/preloaded-images-k8s-v18-v1.35.0-beta.0-containerd-overlay2-arm64.tar.lz4 status code: 404
	W1202 19:07:20.422833   49088 preload.go:144] https://github.com/kubernetes-sigs/minikube-preloads/releases/download/v18/preloaded-images-k8s-v18-v1.35.0-beta.0-containerd-overlay2-arm64.tar.lz4 status code: 404
	I1202 19:07:20.423042   49088 profile.go:143] Saving config to /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/config.json ...
	I1202 19:07:20.423128   49088 cache.go:107] acquiring lock: {Name:mkb3ffc95e4b7ac3756206049d851bf516a8abb7 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1202 19:07:20.423219   49088 cache.go:115] /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/gcr.io/k8s-minikube/storage-provisioner_v5 exists
	I1202 19:07:20.423234   49088 cache.go:96] cache image "gcr.io/k8s-minikube/storage-provisioner:v5" -> "/home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/gcr.io/k8s-minikube/storage-provisioner_v5" took 122.125µs
	I1202 19:07:20.423249   49088 cache.go:80] save to tar file gcr.io/k8s-minikube/storage-provisioner:v5 -> /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/gcr.io/k8s-minikube/storage-provisioner_v5 succeeded
	I1202 19:07:20.423267   49088 cache.go:107] acquiring lock: {Name:mkfa39bba55c97fa80e441f8dcbaf6dc6a2ab6fc Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1202 19:07:20.423303   49088 cache.go:115] /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/kube-apiserver_v1.35.0-beta.0 exists
	I1202 19:07:20.423312   49088 cache.go:96] cache image "registry.k8s.io/kube-apiserver:v1.35.0-beta.0" -> "/home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/kube-apiserver_v1.35.0-beta.0" took 47.557µs
	I1202 19:07:20.423318   49088 cache.go:80] save to tar file registry.k8s.io/kube-apiserver:v1.35.0-beta.0 -> /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/kube-apiserver_v1.35.0-beta.0 succeeded
	I1202 19:07:20.423331   49088 cache.go:107] acquiring lock: {Name:mk7e3720bc30e96a70479f1acc707ef52791d566 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1202 19:07:20.423365   49088 cache.go:115] /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/kube-controller-manager_v1.35.0-beta.0 exists
	I1202 19:07:20.423374   49088 cache.go:96] cache image "registry.k8s.io/kube-controller-manager:v1.35.0-beta.0" -> "/home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/kube-controller-manager_v1.35.0-beta.0" took 44.415µs
	I1202 19:07:20.423380   49088 cache.go:80] save to tar file registry.k8s.io/kube-controller-manager:v1.35.0-beta.0 -> /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/kube-controller-manager_v1.35.0-beta.0 succeeded
	I1202 19:07:20.423395   49088 cache.go:107] acquiring lock: {Name:mk87fcb81abcb9216a37cb770c1db1797c0a7f91 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1202 19:07:20.423422   49088 cache.go:115] /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/kube-scheduler_v1.35.0-beta.0 exists
	I1202 19:07:20.423432   49088 cache.go:96] cache image "registry.k8s.io/kube-scheduler:v1.35.0-beta.0" -> "/home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/kube-scheduler_v1.35.0-beta.0" took 37.579µs
	I1202 19:07:20.423438   49088 cache.go:80] save to tar file registry.k8s.io/kube-scheduler:v1.35.0-beta.0 -> /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/kube-scheduler_v1.35.0-beta.0 succeeded
	I1202 19:07:20.423447   49088 cache.go:107] acquiring lock: {Name:mkb0da8840651a370490ea2b46213e13fc0d5dac Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1202 19:07:20.423476   49088 cache.go:115] /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/kube-proxy_v1.35.0-beta.0 exists
	I1202 19:07:20.423484   49088 cache.go:96] cache image "registry.k8s.io/kube-proxy:v1.35.0-beta.0" -> "/home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/kube-proxy_v1.35.0-beta.0" took 38.933µs
	I1202 19:07:20.423490   49088 cache.go:80] save to tar file registry.k8s.io/kube-proxy:v1.35.0-beta.0 -> /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/kube-proxy_v1.35.0-beta.0 succeeded
	I1202 19:07:20.423510   49088 cache.go:107] acquiring lock: {Name:mk9eec99a3e8e54b076a2ce506d08ceb8a7f49cb Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1202 19:07:20.423540   49088 cache.go:115] /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/pause_3.10.1 exists
	I1202 19:07:20.423549   49088 cache.go:96] cache image "registry.k8s.io/pause:3.10.1" -> "/home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/pause_3.10.1" took 40.796µs
	I1202 19:07:20.423555   49088 cache.go:80] save to tar file registry.k8s.io/pause:3.10.1 -> /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/pause_3.10.1 succeeded
	I1202 19:07:20.423569   49088 cache.go:243] Successfully downloaded all kic artifacts
	I1202 19:07:20.423588   49088 cache.go:107] acquiring lock: {Name:mkbf2c8ea9fae755e8e7ae1c483527f313757bae Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1202 19:07:20.423620   49088 cache.go:115] /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/coredns/coredns_v1.13.1 exists
	I1202 19:07:20.423629   49088 cache.go:96] cache image "registry.k8s.io/coredns/coredns:v1.13.1" -> "/home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/coredns/coredns_v1.13.1" took 42.487µs
	I1202 19:07:20.423635   49088 cache.go:80] save to tar file registry.k8s.io/coredns/coredns:v1.13.1 -> /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/coredns/coredns_v1.13.1 succeeded
	I1202 19:07:20.423646   49088 start.go:360] acquireMachinesLock for functional-449836: {Name:mk8999fdfa518fc15358d07431fe9bec286a035e Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1202 19:07:20.423706   49088 start.go:364] duration metric: took 31.868µs to acquireMachinesLock for "functional-449836"
	I1202 19:07:20.423570   49088 cache.go:107] acquiring lock: {Name:mk280b51a6d3bfe0cb60ae7355309f1bf1f99e1d Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1202 19:07:20.423753   49088 start.go:96] Skipping create...Using existing machine configuration
	I1202 19:07:20.423783   49088 fix.go:54] fixHost starting: 
	I1202 19:07:20.423759   49088 cache.go:115] /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/etcd_3.6.5-0 exists
	I1202 19:07:20.423888   49088 cache.go:96] cache image "registry.k8s.io/etcd:3.6.5-0" -> "/home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/etcd_3.6.5-0" took 323.2µs
	I1202 19:07:20.423896   49088 cache.go:80] save to tar file registry.k8s.io/etcd:3.6.5-0 -> /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/etcd_3.6.5-0 succeeded
	I1202 19:07:20.423906   49088 cache.go:87] Successfully saved all images to host disk.
	I1202 19:07:20.424111   49088 cli_runner.go:164] Run: docker container inspect functional-449836 --format={{.State.Status}}
	I1202 19:07:20.441213   49088 fix.go:112] recreateIfNeeded on functional-449836: state=Running err=<nil>
	W1202 19:07:20.441244   49088 fix.go:138] unexpected machine state, will restart: <nil>
	I1202 19:07:20.444707   49088 out.go:252] * Updating the running docker "functional-449836" container ...
	I1202 19:07:20.444749   49088 machine.go:94] provisionDockerMachine start ...
	I1202 19:07:20.444842   49088 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-449836
	I1202 19:07:20.461943   49088 main.go:143] libmachine: Using SSH client type: native
	I1202 19:07:20.462269   49088 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3dad70] 0x3dd270 <nil>  [] 0s} 127.0.0.1 32788 <nil> <nil>}
	I1202 19:07:20.462284   49088 main.go:143] libmachine: About to run SSH command:
	hostname
	I1202 19:07:20.612055   49088 main.go:143] libmachine: SSH cmd err, output: <nil>: functional-449836
	
	I1202 19:07:20.612125   49088 ubuntu.go:182] provisioning hostname "functional-449836"
	I1202 19:07:20.612222   49088 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-449836
	I1202 19:07:20.629856   49088 main.go:143] libmachine: Using SSH client type: native
	I1202 19:07:20.630166   49088 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3dad70] 0x3dd270 <nil>  [] 0s} 127.0.0.1 32788 <nil> <nil>}
	I1202 19:07:20.630180   49088 main.go:143] libmachine: About to run SSH command:
	sudo hostname functional-449836 && echo "functional-449836" | sudo tee /etc/hostname
	I1202 19:07:20.793419   49088 main.go:143] libmachine: SSH cmd err, output: <nil>: functional-449836
	
	I1202 19:07:20.793536   49088 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-449836
	I1202 19:07:20.812441   49088 main.go:143] libmachine: Using SSH client type: native
	I1202 19:07:20.812754   49088 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3dad70] 0x3dd270 <nil>  [] 0s} 127.0.0.1 32788 <nil> <nil>}
	I1202 19:07:20.812775   49088 main.go:143] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sfunctional-449836' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 functional-449836/g' /etc/hosts;
				else 
					echo '127.0.1.1 functional-449836' | sudo tee -a /etc/hosts; 
				fi
			fi
	I1202 19:07:20.961443   49088 main.go:143] libmachine: SSH cmd err, output: <nil>: 
	I1202 19:07:20.961480   49088 ubuntu.go:188] set auth options {CertDir:/home/jenkins/minikube-integration/22021-2487/.minikube CaCertPath:/home/jenkins/minikube-integration/22021-2487/.minikube/certs/ca.pem CaPrivateKeyPath:/home/jenkins/minikube-integration/22021-2487/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/home/jenkins/minikube-integration/22021-2487/.minikube/machines/server.pem ServerKeyPath:/home/jenkins/minikube-integration/22021-2487/.minikube/machines/server-key.pem ClientKeyPath:/home/jenkins/minikube-integration/22021-2487/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/home/jenkins/minikube-integration/22021-2487/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/home/jenkins/minikube-integration/22021-2487/.minikube}
	I1202 19:07:20.961539   49088 ubuntu.go:190] setting up certificates
	I1202 19:07:20.961556   49088 provision.go:84] configureAuth start
	I1202 19:07:20.961634   49088 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" functional-449836
	I1202 19:07:20.990731   49088 provision.go:143] copyHostCerts
	I1202 19:07:20.990790   49088 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22021-2487/.minikube/certs/ca.pem -> /home/jenkins/minikube-integration/22021-2487/.minikube/ca.pem
	I1202 19:07:20.990838   49088 exec_runner.go:144] found /home/jenkins/minikube-integration/22021-2487/.minikube/ca.pem, removing ...
	I1202 19:07:20.990856   49088 exec_runner.go:203] rm: /home/jenkins/minikube-integration/22021-2487/.minikube/ca.pem
	I1202 19:07:20.990938   49088 exec_runner.go:151] cp: /home/jenkins/minikube-integration/22021-2487/.minikube/certs/ca.pem --> /home/jenkins/minikube-integration/22021-2487/.minikube/ca.pem (1082 bytes)
	I1202 19:07:20.991037   49088 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22021-2487/.minikube/certs/cert.pem -> /home/jenkins/minikube-integration/22021-2487/.minikube/cert.pem
	I1202 19:07:20.991060   49088 exec_runner.go:144] found /home/jenkins/minikube-integration/22021-2487/.minikube/cert.pem, removing ...
	I1202 19:07:20.991069   49088 exec_runner.go:203] rm: /home/jenkins/minikube-integration/22021-2487/.minikube/cert.pem
	I1202 19:07:20.991098   49088 exec_runner.go:151] cp: /home/jenkins/minikube-integration/22021-2487/.minikube/certs/cert.pem --> /home/jenkins/minikube-integration/22021-2487/.minikube/cert.pem (1123 bytes)
	I1202 19:07:20.991189   49088 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22021-2487/.minikube/certs/key.pem -> /home/jenkins/minikube-integration/22021-2487/.minikube/key.pem
	I1202 19:07:20.991211   49088 exec_runner.go:144] found /home/jenkins/minikube-integration/22021-2487/.minikube/key.pem, removing ...
	I1202 19:07:20.991220   49088 exec_runner.go:203] rm: /home/jenkins/minikube-integration/22021-2487/.minikube/key.pem
	I1202 19:07:20.991247   49088 exec_runner.go:151] cp: /home/jenkins/minikube-integration/22021-2487/.minikube/certs/key.pem --> /home/jenkins/minikube-integration/22021-2487/.minikube/key.pem (1675 bytes)
	I1202 19:07:20.991297   49088 provision.go:117] generating server cert: /home/jenkins/minikube-integration/22021-2487/.minikube/machines/server.pem ca-key=/home/jenkins/minikube-integration/22021-2487/.minikube/certs/ca.pem private-key=/home/jenkins/minikube-integration/22021-2487/.minikube/certs/ca-key.pem org=jenkins.functional-449836 san=[127.0.0.1 192.168.49.2 functional-449836 localhost minikube]
	I1202 19:07:21.335552   49088 provision.go:177] copyRemoteCerts
	I1202 19:07:21.335618   49088 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I1202 19:07:21.335658   49088 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-449836
	I1202 19:07:21.354079   49088 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:32788 SSHKeyPath:/home/jenkins/minikube-integration/22021-2487/.minikube/machines/functional-449836/id_rsa Username:docker}
	I1202 19:07:21.460475   49088 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22021-2487/.minikube/machines/server.pem -> /etc/docker/server.pem
	I1202 19:07:21.460535   49088 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/machines/server.pem --> /etc/docker/server.pem (1220 bytes)
	I1202 19:07:21.478965   49088 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22021-2487/.minikube/machines/server-key.pem -> /etc/docker/server-key.pem
	I1202 19:07:21.479028   49088 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1675 bytes)
	I1202 19:07:21.497363   49088 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22021-2487/.minikube/certs/ca.pem -> /etc/docker/ca.pem
	I1202 19:07:21.497471   49088 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1082 bytes)
	I1202 19:07:21.514946   49088 provision.go:87] duration metric: took 553.36724ms to configureAuth
	I1202 19:07:21.515020   49088 ubuntu.go:206] setting minikube options for container-runtime
	I1202 19:07:21.515215   49088 config.go:182] Loaded profile config "functional-449836": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-beta.0
	I1202 19:07:21.515248   49088 machine.go:97] duration metric: took 1.070490831s to provisionDockerMachine
	I1202 19:07:21.515264   49088 start.go:293] postStartSetup for "functional-449836" (driver="docker")
	I1202 19:07:21.515276   49088 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I1202 19:07:21.515329   49088 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I1202 19:07:21.515382   49088 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-449836
	I1202 19:07:21.532644   49088 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:32788 SSHKeyPath:/home/jenkins/minikube-integration/22021-2487/.minikube/machines/functional-449836/id_rsa Username:docker}
	I1202 19:07:21.636416   49088 ssh_runner.go:195] Run: cat /etc/os-release
	I1202 19:07:21.639685   49088 command_runner.go:130] > PRETTY_NAME="Debian GNU/Linux 12 (bookworm)"
	I1202 19:07:21.639756   49088 command_runner.go:130] > NAME="Debian GNU/Linux"
	I1202 19:07:21.639777   49088 command_runner.go:130] > VERSION_ID="12"
	I1202 19:07:21.639798   49088 command_runner.go:130] > VERSION="12 (bookworm)"
	I1202 19:07:21.639827   49088 command_runner.go:130] > VERSION_CODENAME=bookworm
	I1202 19:07:21.639832   49088 command_runner.go:130] > ID=debian
	I1202 19:07:21.639847   49088 command_runner.go:130] > HOME_URL="https://www.debian.org/"
	I1202 19:07:21.639859   49088 command_runner.go:130] > SUPPORT_URL="https://www.debian.org/support"
	I1202 19:07:21.639866   49088 command_runner.go:130] > BUG_REPORT_URL="https://bugs.debian.org/"
	I1202 19:07:21.639943   49088 main.go:143] libmachine: Couldn't set key VERSION_CODENAME, no corresponding struct field found
	I1202 19:07:21.639962   49088 info.go:137] Remote host: Debian GNU/Linux 12 (bookworm)
	I1202 19:07:21.639974   49088 filesync.go:126] Scanning /home/jenkins/minikube-integration/22021-2487/.minikube/addons for local assets ...
	I1202 19:07:21.640036   49088 filesync.go:126] Scanning /home/jenkins/minikube-integration/22021-2487/.minikube/files for local assets ...
	I1202 19:07:21.640112   49088 filesync.go:149] local asset: /home/jenkins/minikube-integration/22021-2487/.minikube/files/etc/ssl/certs/44352.pem -> 44352.pem in /etc/ssl/certs
	I1202 19:07:21.640123   49088 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22021-2487/.minikube/files/etc/ssl/certs/44352.pem -> /etc/ssl/certs/44352.pem
	I1202 19:07:21.640204   49088 filesync.go:149] local asset: /home/jenkins/minikube-integration/22021-2487/.minikube/files/etc/test/nested/copy/4435/hosts -> hosts in /etc/test/nested/copy/4435
	I1202 19:07:21.640213   49088 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22021-2487/.minikube/files/etc/test/nested/copy/4435/hosts -> /etc/test/nested/copy/4435/hosts
	I1202 19:07:21.640263   49088 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs /etc/test/nested/copy/4435
	I1202 19:07:21.647807   49088 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/files/etc/ssl/certs/44352.pem --> /etc/ssl/certs/44352.pem (1708 bytes)
	I1202 19:07:21.664872   49088 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/files/etc/test/nested/copy/4435/hosts --> /etc/test/nested/copy/4435/hosts (40 bytes)
	I1202 19:07:21.686465   49088 start.go:296] duration metric: took 171.184702ms for postStartSetup
	I1202 19:07:21.686545   49088 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I1202 19:07:21.686646   49088 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-449836
	I1202 19:07:21.708068   49088 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:32788 SSHKeyPath:/home/jenkins/minikube-integration/22021-2487/.minikube/machines/functional-449836/id_rsa Username:docker}
	I1202 19:07:21.808826   49088 command_runner.go:130] > 18%
	I1202 19:07:21.809461   49088 ssh_runner.go:195] Run: sh -c "df -BG /var | awk 'NR==2{print $4}'"
	I1202 19:07:21.814183   49088 command_runner.go:130] > 159G
	I1202 19:07:21.814719   49088 fix.go:56] duration metric: took 1.390932828s for fixHost
	I1202 19:07:21.814741   49088 start.go:83] releasing machines lock for "functional-449836", held for 1.391011327s
	I1202 19:07:21.814809   49088 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" functional-449836
	I1202 19:07:21.831833   49088 ssh_runner.go:195] Run: cat /version.json
	I1202 19:07:21.831895   49088 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-449836
	I1202 19:07:21.832169   49088 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I1202 19:07:21.832229   49088 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-449836
	I1202 19:07:21.852617   49088 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:32788 SSHKeyPath:/home/jenkins/minikube-integration/22021-2487/.minikube/machines/functional-449836/id_rsa Username:docker}
	I1202 19:07:21.855772   49088 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:32788 SSHKeyPath:/home/jenkins/minikube-integration/22021-2487/.minikube/machines/functional-449836/id_rsa Username:docker}
	I1202 19:07:21.955939   49088 command_runner.go:130] > {"iso_version": "v1.37.0-1763503576-21924", "kicbase_version": "v0.0.48-1764169655-21974", "minikube_version": "v1.37.0", "commit": "5499406178e21d60d74d327c9716de794e8a4797"}
	I1202 19:07:21.956090   49088 ssh_runner.go:195] Run: systemctl --version
	I1202 19:07:22.048548   49088 command_runner.go:130] > <a href="https://github.com/kubernetes/registry.k8s.io">Temporary Redirect</a>.
	I1202 19:07:22.051368   49088 command_runner.go:130] > systemd 252 (252.39-1~deb12u1)
	I1202 19:07:22.051402   49088 command_runner.go:130] > +PAM +AUDIT +SELINUX +APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT +QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified
	I1202 19:07:22.051488   49088 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	I1202 19:07:22.055900   49088 command_runner.go:130] ! stat: cannot statx '/etc/cni/net.d/*loopback.conf*': No such file or directory
	W1202 19:07:22.056072   49088 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I1202 19:07:22.056144   49088 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%p, " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I1202 19:07:22.064483   49088 cni.go:259] no active bridge cni configs found in "/etc/cni/net.d" - nothing to disable
	I1202 19:07:22.064507   49088 start.go:496] detecting cgroup driver to use...
	I1202 19:07:22.064540   49088 detect.go:187] detected "cgroupfs" cgroup driver on host os
	I1202 19:07:22.064608   49088 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I1202 19:07:22.080944   49088 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I1202 19:07:22.094328   49088 docker.go:218] disabling cri-docker service (if available) ...
	I1202 19:07:22.094412   49088 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.socket
	I1202 19:07:22.110538   49088 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.service
	I1202 19:07:22.123916   49088 ssh_runner.go:195] Run: sudo systemctl disable cri-docker.socket
	I1202 19:07:22.251555   49088 ssh_runner.go:195] Run: sudo systemctl mask cri-docker.service
	I1202 19:07:22.372403   49088 docker.go:234] disabling docker service ...
	I1202 19:07:22.372547   49088 ssh_runner.go:195] Run: sudo systemctl stop -f docker.socket
	I1202 19:07:22.390362   49088 ssh_runner.go:195] Run: sudo systemctl stop -f docker.service
	I1202 19:07:22.404129   49088 ssh_runner.go:195] Run: sudo systemctl disable docker.socket
	I1202 19:07:22.527674   49088 ssh_runner.go:195] Run: sudo systemctl mask docker.service
	I1202 19:07:22.641245   49088 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service docker
	I1202 19:07:22.654510   49088 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I1202 19:07:22.669149   49088 command_runner.go:130] > runtime-endpoint: unix:///run/containerd/containerd.sock
	I1202 19:07:22.670616   49088 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.10.1"|' /etc/containerd/config.toml"
	I1202 19:07:22.680782   49088 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I1202 19:07:22.690619   49088 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I1202 19:07:22.690690   49088 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I1202 19:07:22.700650   49088 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I1202 19:07:22.710637   49088 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I1202 19:07:22.720237   49088 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I1202 19:07:22.730375   49088 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I1202 19:07:22.738458   49088 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I1202 19:07:22.747256   49088 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I1202 19:07:22.756269   49088 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
	I1202 19:07:22.765824   49088 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I1202 19:07:22.772632   49088 command_runner.go:130] > net.bridge.bridge-nf-call-iptables = 1
	I1202 19:07:22.773683   49088 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I1202 19:07:22.781384   49088 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1202 19:07:22.894036   49088 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I1202 19:07:22.996092   49088 start.go:543] Will wait 60s for socket path /run/containerd/containerd.sock
	I1202 19:07:22.996190   49088 ssh_runner.go:195] Run: stat /run/containerd/containerd.sock
	I1202 19:07:23.000049   49088 command_runner.go:130] >   File: /run/containerd/containerd.sock
	I1202 19:07:23.000075   49088 command_runner.go:130] >   Size: 0         	Blocks: 0          IO Block: 4096   socket
	I1202 19:07:23.000083   49088 command_runner.go:130] > Device: 0,72	Inode: 1611        Links: 1
	I1202 19:07:23.000090   49088 command_runner.go:130] > Access: (0660/srw-rw----)  Uid: (    0/    root)   Gid: (    0/    root)
	I1202 19:07:23.000119   49088 command_runner.go:130] > Access: 2025-12-02 19:07:22.966112143 +0000
	I1202 19:07:23.000134   49088 command_runner.go:130] > Modify: 2025-12-02 19:07:22.966112143 +0000
	I1202 19:07:23.000139   49088 command_runner.go:130] > Change: 2025-12-02 19:07:22.966112143 +0000
	I1202 19:07:23.000143   49088 command_runner.go:130] >  Birth: -
	I1202 19:07:23.000708   49088 start.go:564] Will wait 60s for crictl version
	I1202 19:07:23.000798   49088 ssh_runner.go:195] Run: which crictl
	I1202 19:07:23.004553   49088 command_runner.go:130] > /usr/local/bin/crictl
	I1202 19:07:23.004698   49088 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl version
	I1202 19:07:23.031006   49088 command_runner.go:130] > Version:  0.1.0
	I1202 19:07:23.031142   49088 command_runner.go:130] > RuntimeName:  containerd
	I1202 19:07:23.031156   49088 command_runner.go:130] > RuntimeVersion:  v2.1.5
	I1202 19:07:23.031165   49088 command_runner.go:130] > RuntimeApiVersion:  v1
	I1202 19:07:23.033497   49088 start.go:580] Version:  0.1.0
	RuntimeName:  containerd
	RuntimeVersion:  v2.1.5
	RuntimeApiVersion:  v1
	I1202 19:07:23.033588   49088 ssh_runner.go:195] Run: containerd --version
	I1202 19:07:23.053512   49088 command_runner.go:130] > containerd containerd.io v2.1.5 fcd43222d6b07379a4be9786bda52438f0dd16a1
	I1202 19:07:23.055064   49088 ssh_runner.go:195] Run: containerd --version
	I1202 19:07:23.073280   49088 command_runner.go:130] > containerd containerd.io v2.1.5 fcd43222d6b07379a4be9786bda52438f0dd16a1
	I1202 19:07:23.080684   49088 out.go:179] * Preparing Kubernetes v1.35.0-beta.0 on containerd 2.1.5 ...
	I1202 19:07:23.083736   49088 cli_runner.go:164] Run: docker network inspect functional-449836 --format "{"Name": "{{.Name}}","Driver": "{{.Driver}}","Subnet": "{{range .IPAM.Config}}{{.Subnet}}{{end}}","Gateway": "{{range .IPAM.Config}}{{.Gateway}}{{end}}","MTU": {{if (index .Options "com.docker.network.driver.mtu")}}{{(index .Options "com.docker.network.driver.mtu")}}{{else}}0{{end}}, "ContainerIPs": [{{range $k,$v := .Containers }}"{{$v.IPv4Address}}",{{end}}]}"
	I1202 19:07:23.100485   49088 ssh_runner.go:195] Run: grep 192.168.49.1	host.minikube.internal$ /etc/hosts
	I1202 19:07:23.104603   49088 command_runner.go:130] > 192.168.49.1	host.minikube.internal
	I1202 19:07:23.104709   49088 kubeadm.go:884] updating cluster {Name:functional-449836 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-449836 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} ...
	I1202 19:07:23.104831   49088 preload.go:188] Checking if preload exists for k8s version v1.35.0-beta.0 and runtime containerd
	I1202 19:07:23.104890   49088 ssh_runner.go:195] Run: sudo crictl images --output json
	I1202 19:07:23.127690   49088 command_runner.go:130] > {
	I1202 19:07:23.127710   49088 command_runner.go:130] >   "images":  [
	I1202 19:07:23.127715   49088 command_runner.go:130] >     {
	I1202 19:07:23.127725   49088 command_runner.go:130] >       "id":  "sha256:66749159455b3f08c8318fe0233122f54d0f5889f9c5fdfb73c3fd9d99895b51",
	I1202 19:07:23.127729   49088 command_runner.go:130] >       "repoTags":  [
	I1202 19:07:23.127744   49088 command_runner.go:130] >         "gcr.io/k8s-minikube/storage-provisioner:v5"
	I1202 19:07:23.127750   49088 command_runner.go:130] >       ],
	I1202 19:07:23.127755   49088 command_runner.go:130] >       "repoDigests":  [],
	I1202 19:07:23.127759   49088 command_runner.go:130] >       "size":  "8032639",
	I1202 19:07:23.127765   49088 command_runner.go:130] >       "username":  "",
	I1202 19:07:23.127776   49088 command_runner.go:130] >       "pinned":  false
	I1202 19:07:23.127781   49088 command_runner.go:130] >     },
	I1202 19:07:23.127784   49088 command_runner.go:130] >     {
	I1202 19:07:23.127792   49088 command_runner.go:130] >       "id":  "sha256:e08f4d9d2e6ede8185064c13b41f8eeee95b609c0ca93b6fe7509fe527c907cf",
	I1202 19:07:23.127800   49088 command_runner.go:130] >       "repoTags":  [
	I1202 19:07:23.127806   49088 command_runner.go:130] >         "registry.k8s.io/coredns/coredns:v1.13.1"
	I1202 19:07:23.127813   49088 command_runner.go:130] >       ],
	I1202 19:07:23.127817   49088 command_runner.go:130] >       "repoDigests":  [],
	I1202 19:07:23.127822   49088 command_runner.go:130] >       "size":  "21166088",
	I1202 19:07:23.127826   49088 command_runner.go:130] >       "username":  "nonroot",
	I1202 19:07:23.127832   49088 command_runner.go:130] >       "pinned":  false
	I1202 19:07:23.127835   49088 command_runner.go:130] >     },
	I1202 19:07:23.127838   49088 command_runner.go:130] >     {
	I1202 19:07:23.127845   49088 command_runner.go:130] >       "id":  "sha256:2c5f0dedd21c25ec3a6709934d22152d53ec50fe57b72d29e4450655e3d14d42",
	I1202 19:07:23.127855   49088 command_runner.go:130] >       "repoTags":  [
	I1202 19:07:23.127869   49088 command_runner.go:130] >         "registry.k8s.io/etcd:3.6.5-0"
	I1202 19:07:23.127876   49088 command_runner.go:130] >       ],
	I1202 19:07:23.127880   49088 command_runner.go:130] >       "repoDigests":  [],
	I1202 19:07:23.127887   49088 command_runner.go:130] >       "size":  "21134420",
	I1202 19:07:23.127892   49088 command_runner.go:130] >       "uid":  {
	I1202 19:07:23.127899   49088 command_runner.go:130] >         "value":  "0"
	I1202 19:07:23.127903   49088 command_runner.go:130] >       },
	I1202 19:07:23.127907   49088 command_runner.go:130] >       "username":  "",
	I1202 19:07:23.127911   49088 command_runner.go:130] >       "pinned":  false
	I1202 19:07:23.127917   49088 command_runner.go:130] >     },
	I1202 19:07:23.127919   49088 command_runner.go:130] >     {
	I1202 19:07:23.127926   49088 command_runner.go:130] >       "id":  "sha256:ccd634d9bcc36ac6235e9c86676cd3a02c06afc3788a25f1bbf39ca7d44585f4",
	I1202 19:07:23.127930   49088 command_runner.go:130] >       "repoTags":  [
	I1202 19:07:23.127938   49088 command_runner.go:130] >         "registry.k8s.io/kube-apiserver:v1.35.0-beta.0"
	I1202 19:07:23.127945   49088 command_runner.go:130] >       ],
	I1202 19:07:23.127949   49088 command_runner.go:130] >       "repoDigests":  [],
	I1202 19:07:23.127953   49088 command_runner.go:130] >       "size":  "24676285",
	I1202 19:07:23.127961   49088 command_runner.go:130] >       "uid":  {
	I1202 19:07:23.127965   49088 command_runner.go:130] >         "value":  "0"
	I1202 19:07:23.127971   49088 command_runner.go:130] >       },
	I1202 19:07:23.127975   49088 command_runner.go:130] >       "username":  "",
	I1202 19:07:23.127983   49088 command_runner.go:130] >       "pinned":  false
	I1202 19:07:23.127987   49088 command_runner.go:130] >     },
	I1202 19:07:23.127996   49088 command_runner.go:130] >     {
	I1202 19:07:23.128002   49088 command_runner.go:130] >       "id":  "sha256:68b5f775f18769fcb77bd8474c80bda2050163b6c66f4551f352b7381b8ca5be",
	I1202 19:07:23.128006   49088 command_runner.go:130] >       "repoTags":  [
	I1202 19:07:23.128012   49088 command_runner.go:130] >         "registry.k8s.io/kube-controller-manager:v1.35.0-beta.0"
	I1202 19:07:23.128015   49088 command_runner.go:130] >       ],
	I1202 19:07:23.128019   49088 command_runner.go:130] >       "repoDigests":  [],
	I1202 19:07:23.128026   49088 command_runner.go:130] >       "size":  "20658969",
	I1202 19:07:23.128029   49088 command_runner.go:130] >       "uid":  {
	I1202 19:07:23.128033   49088 command_runner.go:130] >         "value":  "0"
	I1202 19:07:23.128041   49088 command_runner.go:130] >       },
	I1202 19:07:23.128052   49088 command_runner.go:130] >       "username":  "",
	I1202 19:07:23.128059   49088 command_runner.go:130] >       "pinned":  false
	I1202 19:07:23.128063   49088 command_runner.go:130] >     },
	I1202 19:07:23.128070   49088 command_runner.go:130] >     {
	I1202 19:07:23.128077   49088 command_runner.go:130] >       "id":  "sha256:404c2e12861777b763b8feaa316d36680fc68ad308a8d2f6e55f1bb981cdd904",
	I1202 19:07:23.128081   49088 command_runner.go:130] >       "repoTags":  [
	I1202 19:07:23.128088   49088 command_runner.go:130] >         "registry.k8s.io/kube-proxy:v1.35.0-beta.0"
	I1202 19:07:23.128092   49088 command_runner.go:130] >       ],
	I1202 19:07:23.128096   49088 command_runner.go:130] >       "repoDigests":  [],
	I1202 19:07:23.128099   49088 command_runner.go:130] >       "size":  "22428165",
	I1202 19:07:23.128103   49088 command_runner.go:130] >       "username":  "",
	I1202 19:07:23.128109   49088 command_runner.go:130] >       "pinned":  false
	I1202 19:07:23.128113   49088 command_runner.go:130] >     },
	I1202 19:07:23.128116   49088 command_runner.go:130] >     {
	I1202 19:07:23.128123   49088 command_runner.go:130] >       "id":  "sha256:16378741539f1be9c6e347d127537d379a6592587b09b4eb47964cb5c43a409b",
	I1202 19:07:23.128130   49088 command_runner.go:130] >       "repoTags":  [
	I1202 19:07:23.128135   49088 command_runner.go:130] >         "registry.k8s.io/kube-scheduler:v1.35.0-beta.0"
	I1202 19:07:23.128143   49088 command_runner.go:130] >       ],
	I1202 19:07:23.128152   49088 command_runner.go:130] >       "repoDigests":  [],
	I1202 19:07:23.128160   49088 command_runner.go:130] >       "size":  "15389290",
	I1202 19:07:23.128163   49088 command_runner.go:130] >       "uid":  {
	I1202 19:07:23.128167   49088 command_runner.go:130] >         "value":  "0"
	I1202 19:07:23.128170   49088 command_runner.go:130] >       },
	I1202 19:07:23.128175   49088 command_runner.go:130] >       "username":  "",
	I1202 19:07:23.128179   49088 command_runner.go:130] >       "pinned":  false
	I1202 19:07:23.128185   49088 command_runner.go:130] >     },
	I1202 19:07:23.128188   49088 command_runner.go:130] >     {
	I1202 19:07:23.128199   49088 command_runner.go:130] >       "id":  "sha256:d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd",
	I1202 19:07:23.128203   49088 command_runner.go:130] >       "repoTags":  [
	I1202 19:07:23.128212   49088 command_runner.go:130] >         "registry.k8s.io/pause:3.10.1"
	I1202 19:07:23.128215   49088 command_runner.go:130] >       ],
	I1202 19:07:23.128223   49088 command_runner.go:130] >       "repoDigests":  [],
	I1202 19:07:23.128227   49088 command_runner.go:130] >       "size":  "265458",
	I1202 19:07:23.128238   49088 command_runner.go:130] >       "uid":  {
	I1202 19:07:23.128243   49088 command_runner.go:130] >         "value":  "65535"
	I1202 19:07:23.128248   49088 command_runner.go:130] >       },
	I1202 19:07:23.128252   49088 command_runner.go:130] >       "username":  "",
	I1202 19:07:23.128256   49088 command_runner.go:130] >       "pinned":  true
	I1202 19:07:23.128259   49088 command_runner.go:130] >     }
	I1202 19:07:23.128262   49088 command_runner.go:130] >   ]
	I1202 19:07:23.128265   49088 command_runner.go:130] > }
	I1202 19:07:23.130379   49088 containerd.go:627] all images are preloaded for containerd runtime.
	I1202 19:07:23.130403   49088 cache_images.go:86] Images are preloaded, skipping loading
	I1202 19:07:23.130410   49088 kubeadm.go:935] updating node { 192.168.49.2 8441 v1.35.0-beta.0 containerd true true} ...
	I1202 19:07:23.130509   49088 kubeadm.go:947] kubelet [Unit]
	Wants=containerd.service
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.35.0-beta.0/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=functional-449836 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.168.49.2
	
	[Install]
	 config:
	{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-449836 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I1202 19:07:23.130576   49088 ssh_runner.go:195] Run: sudo crictl info
	I1202 19:07:23.152707   49088 command_runner.go:130] > {
	I1202 19:07:23.152731   49088 command_runner.go:130] >   "cniconfig": {
	I1202 19:07:23.152737   49088 command_runner.go:130] >     "Networks": [
	I1202 19:07:23.152741   49088 command_runner.go:130] >       {
	I1202 19:07:23.152746   49088 command_runner.go:130] >         "Config": {
	I1202 19:07:23.152752   49088 command_runner.go:130] >           "CNIVersion": "0.3.1",
	I1202 19:07:23.152758   49088 command_runner.go:130] >           "Name": "cni-loopback",
	I1202 19:07:23.152768   49088 command_runner.go:130] >           "Plugins": [
	I1202 19:07:23.152775   49088 command_runner.go:130] >             {
	I1202 19:07:23.152779   49088 command_runner.go:130] >               "Network": {
	I1202 19:07:23.152784   49088 command_runner.go:130] >                 "ipam": {},
	I1202 19:07:23.152789   49088 command_runner.go:130] >                 "type": "loopback"
	I1202 19:07:23.152798   49088 command_runner.go:130] >               },
	I1202 19:07:23.152803   49088 command_runner.go:130] >               "Source": "{\"type\":\"loopback\"}"
	I1202 19:07:23.152810   49088 command_runner.go:130] >             }
	I1202 19:07:23.152814   49088 command_runner.go:130] >           ],
	I1202 19:07:23.152828   49088 command_runner.go:130] >           "Source": "{\n\"cniVersion\": \"0.3.1\",\n\"name\": \"cni-loopback\",\n\"plugins\": [{\n  \"type\": \"loopback\"\n}]\n}"
	I1202 19:07:23.152835   49088 command_runner.go:130] >         },
	I1202 19:07:23.152840   49088 command_runner.go:130] >         "IFName": "lo"
	I1202 19:07:23.152847   49088 command_runner.go:130] >       }
	I1202 19:07:23.152850   49088 command_runner.go:130] >     ],
	I1202 19:07:23.152855   49088 command_runner.go:130] >     "PluginConfDir": "/etc/cni/net.d",
	I1202 19:07:23.152860   49088 command_runner.go:130] >     "PluginDirs": [
	I1202 19:07:23.152865   49088 command_runner.go:130] >       "/opt/cni/bin"
	I1202 19:07:23.152869   49088 command_runner.go:130] >     ],
	I1202 19:07:23.152873   49088 command_runner.go:130] >     "PluginMaxConfNum": 1,
	I1202 19:07:23.152879   49088 command_runner.go:130] >     "Prefix": "eth"
	I1202 19:07:23.152883   49088 command_runner.go:130] >   },
	I1202 19:07:23.152891   49088 command_runner.go:130] >   "config": {
	I1202 19:07:23.152894   49088 command_runner.go:130] >     "cdiSpecDirs": [
	I1202 19:07:23.152898   49088 command_runner.go:130] >       "/etc/cdi",
	I1202 19:07:23.152907   49088 command_runner.go:130] >       "/var/run/cdi"
	I1202 19:07:23.152910   49088 command_runner.go:130] >     ],
	I1202 19:07:23.152917   49088 command_runner.go:130] >     "cni": {
	I1202 19:07:23.152921   49088 command_runner.go:130] >       "binDir": "",
	I1202 19:07:23.152928   49088 command_runner.go:130] >       "binDirs": [
	I1202 19:07:23.152933   49088 command_runner.go:130] >         "/opt/cni/bin"
	I1202 19:07:23.152936   49088 command_runner.go:130] >       ],
	I1202 19:07:23.152941   49088 command_runner.go:130] >       "confDir": "/etc/cni/net.d",
	I1202 19:07:23.152947   49088 command_runner.go:130] >       "confTemplate": "",
	I1202 19:07:23.152954   49088 command_runner.go:130] >       "ipPref": "",
	I1202 19:07:23.152958   49088 command_runner.go:130] >       "maxConfNum": 1,
	I1202 19:07:23.152963   49088 command_runner.go:130] >       "setupSerially": false,
	I1202 19:07:23.152969   49088 command_runner.go:130] >       "useInternalLoopback": false
	I1202 19:07:23.152977   49088 command_runner.go:130] >     },
	I1202 19:07:23.152983   49088 command_runner.go:130] >     "containerd": {
	I1202 19:07:23.152992   49088 command_runner.go:130] >       "defaultRuntimeName": "runc",
	I1202 19:07:23.152997   49088 command_runner.go:130] >       "ignoreBlockIONotEnabledErrors": false,
	I1202 19:07:23.153006   49088 command_runner.go:130] >       "ignoreRdtNotEnabledErrors": false,
	I1202 19:07:23.153010   49088 command_runner.go:130] >       "runtimes": {
	I1202 19:07:23.153017   49088 command_runner.go:130] >         "runc": {
	I1202 19:07:23.153022   49088 command_runner.go:130] >           "ContainerAnnotations": null,
	I1202 19:07:23.153026   49088 command_runner.go:130] >           "PodAnnotations": null,
	I1202 19:07:23.153031   49088 command_runner.go:130] >           "baseRuntimeSpec": "",
	I1202 19:07:23.153035   49088 command_runner.go:130] >           "cgroupWritable": false,
	I1202 19:07:23.153041   49088 command_runner.go:130] >           "cniConfDir": "",
	I1202 19:07:23.153046   49088 command_runner.go:130] >           "cniMaxConfNum": 0,
	I1202 19:07:23.153053   49088 command_runner.go:130] >           "io_type": "",
	I1202 19:07:23.153058   49088 command_runner.go:130] >           "options": {
	I1202 19:07:23.153066   49088 command_runner.go:130] >             "BinaryName": "",
	I1202 19:07:23.153071   49088 command_runner.go:130] >             "CriuImagePath": "",
	I1202 19:07:23.153079   49088 command_runner.go:130] >             "CriuWorkPath": "",
	I1202 19:07:23.153083   49088 command_runner.go:130] >             "IoGid": 0,
	I1202 19:07:23.153091   49088 command_runner.go:130] >             "IoUid": 0,
	I1202 19:07:23.153096   49088 command_runner.go:130] >             "NoNewKeyring": false,
	I1202 19:07:23.153100   49088 command_runner.go:130] >             "Root": "",
	I1202 19:07:23.153104   49088 command_runner.go:130] >             "ShimCgroup": "",
	I1202 19:07:23.153111   49088 command_runner.go:130] >             "SystemdCgroup": false
	I1202 19:07:23.153115   49088 command_runner.go:130] >           },
	I1202 19:07:23.153120   49088 command_runner.go:130] >           "privileged_without_host_devices": false,
	I1202 19:07:23.153128   49088 command_runner.go:130] >           "privileged_without_host_devices_all_devices_allowed": false,
	I1202 19:07:23.153136   49088 command_runner.go:130] >           "runtimePath": "",
	I1202 19:07:23.153143   49088 command_runner.go:130] >           "runtimeType": "io.containerd.runc.v2",
	I1202 19:07:23.153237   49088 command_runner.go:130] >           "sandboxer": "podsandbox",
	I1202 19:07:23.153375   49088 command_runner.go:130] >           "snapshotter": ""
	I1202 19:07:23.153385   49088 command_runner.go:130] >         }
	I1202 19:07:23.153389   49088 command_runner.go:130] >       }
	I1202 19:07:23.153393   49088 command_runner.go:130] >     },
	I1202 19:07:23.153414   49088 command_runner.go:130] >     "containerdEndpoint": "/run/containerd/containerd.sock",
	I1202 19:07:23.153424   49088 command_runner.go:130] >     "containerdRootDir": "/var/lib/containerd",
	I1202 19:07:23.153435   49088 command_runner.go:130] >     "device_ownership_from_security_context": false,
	I1202 19:07:23.153444   49088 command_runner.go:130] >     "disableApparmor": false,
	I1202 19:07:23.153449   49088 command_runner.go:130] >     "disableHugetlbController": true,
	I1202 19:07:23.153457   49088 command_runner.go:130] >     "disableProcMount": false,
	I1202 19:07:23.153467   49088 command_runner.go:130] >     "drainExecSyncIOTimeout": "0s",
	I1202 19:07:23.153475   49088 command_runner.go:130] >     "enableCDI": true,
	I1202 19:07:23.153479   49088 command_runner.go:130] >     "enableSelinux": false,
	I1202 19:07:23.153484   49088 command_runner.go:130] >     "enableUnprivilegedICMP": true,
	I1202 19:07:23.153490   49088 command_runner.go:130] >     "enableUnprivilegedPorts": true,
	I1202 19:07:23.153500   49088 command_runner.go:130] >     "ignoreDeprecationWarnings": null,
	I1202 19:07:23.153508   49088 command_runner.go:130] >     "ignoreImageDefinedVolumes": false,
	I1202 19:07:23.153516   49088 command_runner.go:130] >     "maxContainerLogLineSize": 16384,
	I1202 19:07:23.153522   49088 command_runner.go:130] >     "netnsMountsUnderStateDir": false,
	I1202 19:07:23.153534   49088 command_runner.go:130] >     "restrictOOMScoreAdj": false,
	I1202 19:07:23.153544   49088 command_runner.go:130] >     "rootDir": "/var/lib/containerd/io.containerd.grpc.v1.cri",
	I1202 19:07:23.153549   49088 command_runner.go:130] >     "selinuxCategoryRange": 1024,
	I1202 19:07:23.153562   49088 command_runner.go:130] >     "stateDir": "/run/containerd/io.containerd.grpc.v1.cri",
	I1202 19:07:23.153570   49088 command_runner.go:130] >     "tolerateMissingHugetlbController": true,
	I1202 19:07:23.153575   49088 command_runner.go:130] >     "unsetSeccompProfile": ""
	I1202 19:07:23.153578   49088 command_runner.go:130] >   },
	I1202 19:07:23.153582   49088 command_runner.go:130] >   "features": {
	I1202 19:07:23.153588   49088 command_runner.go:130] >     "supplemental_groups_policy": true
	I1202 19:07:23.153597   49088 command_runner.go:130] >   },
	I1202 19:07:23.153605   49088 command_runner.go:130] >   "golang": "go1.24.9",
	I1202 19:07:23.153615   49088 command_runner.go:130] >   "lastCNILoadStatus": "cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config",
	I1202 19:07:23.153633   49088 command_runner.go:130] >   "lastCNILoadStatus.default": "cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config",
	I1202 19:07:23.153644   49088 command_runner.go:130] >   "runtimeHandlers": [
	I1202 19:07:23.153649   49088 command_runner.go:130] >     {
	I1202 19:07:23.153658   49088 command_runner.go:130] >       "features": {
	I1202 19:07:23.153664   49088 command_runner.go:130] >         "recursive_read_only_mounts": true,
	I1202 19:07:23.153669   49088 command_runner.go:130] >         "user_namespaces": true
	I1202 19:07:23.153675   49088 command_runner.go:130] >       }
	I1202 19:07:23.153679   49088 command_runner.go:130] >     },
	I1202 19:07:23.153686   49088 command_runner.go:130] >     {
	I1202 19:07:23.153691   49088 command_runner.go:130] >       "features": {
	I1202 19:07:23.153703   49088 command_runner.go:130] >         "recursive_read_only_mounts": true,
	I1202 19:07:23.153708   49088 command_runner.go:130] >         "user_namespaces": true
	I1202 19:07:23.153715   49088 command_runner.go:130] >       },
	I1202 19:07:23.153720   49088 command_runner.go:130] >       "name": "runc"
	I1202 19:07:23.153727   49088 command_runner.go:130] >     }
	I1202 19:07:23.153731   49088 command_runner.go:130] >   ],
	I1202 19:07:23.153738   49088 command_runner.go:130] >   "status": {
	I1202 19:07:23.153742   49088 command_runner.go:130] >     "conditions": [
	I1202 19:07:23.153746   49088 command_runner.go:130] >       {
	I1202 19:07:23.153751   49088 command_runner.go:130] >         "message": "",
	I1202 19:07:23.153757   49088 command_runner.go:130] >         "reason": "",
	I1202 19:07:23.153766   49088 command_runner.go:130] >         "status": true,
	I1202 19:07:23.153774   49088 command_runner.go:130] >         "type": "RuntimeReady"
	I1202 19:07:23.153781   49088 command_runner.go:130] >       },
	I1202 19:07:23.153785   49088 command_runner.go:130] >       {
	I1202 19:07:23.153792   49088 command_runner.go:130] >         "message": "Network plugin returns error: cni plugin not initialized",
	I1202 19:07:23.153797   49088 command_runner.go:130] >         "reason": "NetworkPluginNotReady",
	I1202 19:07:23.153805   49088 command_runner.go:130] >         "status": false,
	I1202 19:07:23.153810   49088 command_runner.go:130] >         "type": "NetworkReady"
	I1202 19:07:23.153814   49088 command_runner.go:130] >       },
	I1202 19:07:23.153820   49088 command_runner.go:130] >       {
	I1202 19:07:23.153824   49088 command_runner.go:130] >         "message": "",
	I1202 19:07:23.153828   49088 command_runner.go:130] >         "reason": "",
	I1202 19:07:23.153836   49088 command_runner.go:130] >         "status": true,
	I1202 19:07:23.153850   49088 command_runner.go:130] >         "type": "ContainerdHasNoDeprecationWarnings"
	I1202 19:07:23.153857   49088 command_runner.go:130] >       }
	I1202 19:07:23.153861   49088 command_runner.go:130] >     ]
	I1202 19:07:23.153868   49088 command_runner.go:130] >   }
	I1202 19:07:23.153871   49088 command_runner.go:130] > }
	I1202 19:07:23.157283   49088 cni.go:84] Creating CNI manager for ""
	I1202 19:07:23.157307   49088 cni.go:143] "docker" driver + "containerd" runtime found, recommending kindnet
	I1202 19:07:23.157324   49088 kubeadm.go:85] Using pod CIDR: 10.244.0.0/16
	I1202 19:07:23.157352   49088 kubeadm.go:190] kubeadm options: {CertDir:/var/lib/minikube/certs ServiceCIDR:10.96.0.0/12 PodSubnet:10.244.0.0/16 AdvertiseAddress:192.168.49.2 APIServerPort:8441 KubernetesVersion:v1.35.0-beta.0 EtcdDataDir:/var/lib/minikube/etcd EtcdExtraArgs:map[] ClusterName:functional-449836 NodeName:functional-449836 DNSDomain:cluster.local CRISocket:/run/containerd/containerd.sock ImageRepository: ComponentOptions:[{Component:apiServer ExtraArgs:map[enable-admission-plugins:NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota] Pairs:map[certSANs:["127.0.0.1", "localhost", "192.168.49.2"]]} {Component:controllerManager ExtraArgs:map[allocate-node-cidrs:true leader-elect:false] Pairs:map[]} {Component:scheduler ExtraArgs:map[leader-elect:false] Pairs:map[]}] FeatureArgs:map[] NodeIP:192.168.49.2 CgroupDriver:cgroupfs ClientCAFile:/var/lib/minikube/certs/ca.crt StaticPodPath:/etc/kubernetes/manifests ControlPlaneAddress:control-plane.minikube.internal KubeProxyOptions:map[] ResolvConfSearchRegression:false KubeletConfigOpts:map[containerRuntimeEndpoint:unix:///run/containerd/containerd.sock hairpinMode:hairpin-veth runtimeRequestTimeout:15m] PrependCriSocketUnix:true}
	I1202 19:07:23.157503   49088 kubeadm.go:196] kubeadm config:
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: InitConfiguration
	localAPIEndpoint:
	  advertiseAddress: 192.168.49.2
	  bindPort: 8441
	bootstrapTokens:
	  - groups:
	      - system:bootstrappers:kubeadm:default-node-token
	    ttl: 24h0m0s
	    usages:
	      - signing
	      - authentication
	nodeRegistration:
	  criSocket: unix:///run/containerd/containerd.sock
	  name: "functional-449836"
	  kubeletExtraArgs:
	    - name: "node-ip"
	      value: "192.168.49.2"
	  taints: []
	---
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: ClusterConfiguration
	apiServer:
	  certSANs: ["127.0.0.1", "localhost", "192.168.49.2"]
	  extraArgs:
	    - name: "enable-admission-plugins"
	      value: "NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota"
	controllerManager:
	  extraArgs:
	    - name: "allocate-node-cidrs"
	      value: "true"
	    - name: "leader-elect"
	      value: "false"
	scheduler:
	  extraArgs:
	    - name: "leader-elect"
	      value: "false"
	certificatesDir: /var/lib/minikube/certs
	clusterName: mk
	controlPlaneEndpoint: control-plane.minikube.internal:8441
	etcd:
	  local:
	    dataDir: /var/lib/minikube/etcd
	kubernetesVersion: v1.35.0-beta.0
	networking:
	  dnsDomain: cluster.local
	  podSubnet: "10.244.0.0/16"
	  serviceSubnet: 10.96.0.0/12
	---
	apiVersion: kubelet.config.k8s.io/v1beta1
	kind: KubeletConfiguration
	authentication:
	  x509:
	    clientCAFile: /var/lib/minikube/certs/ca.crt
	cgroupDriver: cgroupfs
	containerRuntimeEndpoint: unix:///run/containerd/containerd.sock
	hairpinMode: hairpin-veth
	runtimeRequestTimeout: 15m
	clusterDomain: "cluster.local"
	# disable disk resource management by default
	imageGCHighThresholdPercent: 100
	evictionHard:
	  nodefs.available: "0%"
	  nodefs.inodesFree: "0%"
	  imagefs.available: "0%"
	failSwapOn: false
	staticPodPath: /etc/kubernetes/manifests
	---
	apiVersion: kubeproxy.config.k8s.io/v1alpha1
	kind: KubeProxyConfiguration
	clusterCIDR: "10.244.0.0/16"
	metricsBindAddress: 0.0.0.0:10249
	conntrack:
	  maxPerCore: 0
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_established"
	  tcpEstablishedTimeout: 0s
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_close"
	  tcpCloseWaitTimeout: 0s
	
	I1202 19:07:23.157589   49088 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.35.0-beta.0
	I1202 19:07:23.165274   49088 command_runner.go:130] > kubeadm
	I1202 19:07:23.165296   49088 command_runner.go:130] > kubectl
	I1202 19:07:23.165301   49088 command_runner.go:130] > kubelet
	I1202 19:07:23.166244   49088 binaries.go:51] Found k8s binaries, skipping transfer
	I1202 19:07:23.166309   49088 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /var/tmp/minikube
	I1202 19:07:23.176520   49088 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (328 bytes)
	I1202 19:07:23.191534   49088 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (359 bytes)
	I1202 19:07:23.207596   49088 ssh_runner.go:362] scp memory --> /var/tmp/minikube/kubeadm.yaml.new (2237 bytes)
	I1202 19:07:23.221899   49088 ssh_runner.go:195] Run: grep 192.168.49.2	control-plane.minikube.internal$ /etc/hosts
	I1202 19:07:23.225538   49088 command_runner.go:130] > 192.168.49.2	control-plane.minikube.internal
	I1202 19:07:23.225972   49088 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1202 19:07:23.344071   49088 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I1202 19:07:24.171449   49088 certs.go:69] Setting up /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836 for IP: 192.168.49.2
	I1202 19:07:24.171473   49088 certs.go:195] generating shared ca certs ...
	I1202 19:07:24.171491   49088 certs.go:227] acquiring lock for ca certs: {Name:mk2ce7651a779b9fbf8eac798f9ac184328de0c6 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1202 19:07:24.171633   49088 certs.go:236] skipping valid "minikubeCA" ca cert: /home/jenkins/minikube-integration/22021-2487/.minikube/ca.key
	I1202 19:07:24.171683   49088 certs.go:236] skipping valid "proxyClientCA" ca cert: /home/jenkins/minikube-integration/22021-2487/.minikube/proxy-client-ca.key
	I1202 19:07:24.171697   49088 certs.go:257] generating profile certs ...
	I1202 19:07:24.171794   49088 certs.go:360] skipping valid signed profile cert regeneration for "minikube-user": /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/client.key
	I1202 19:07:24.171860   49088 certs.go:360] skipping valid signed profile cert regeneration for "minikube": /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/apiserver.key.a65b71da
	I1202 19:07:24.171905   49088 certs.go:360] skipping valid signed profile cert regeneration for "aggregator": /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/proxy-client.key
	I1202 19:07:24.171916   49088 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22021-2487/.minikube/ca.crt -> /var/lib/minikube/certs/ca.crt
	I1202 19:07:24.171929   49088 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22021-2487/.minikube/ca.key -> /var/lib/minikube/certs/ca.key
	I1202 19:07:24.171946   49088 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22021-2487/.minikube/proxy-client-ca.crt -> /var/lib/minikube/certs/proxy-client-ca.crt
	I1202 19:07:24.171957   49088 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22021-2487/.minikube/proxy-client-ca.key -> /var/lib/minikube/certs/proxy-client-ca.key
	I1202 19:07:24.171972   49088 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/apiserver.crt -> /var/lib/minikube/certs/apiserver.crt
	I1202 19:07:24.171985   49088 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/apiserver.key -> /var/lib/minikube/certs/apiserver.key
	I1202 19:07:24.172001   49088 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/proxy-client.crt -> /var/lib/minikube/certs/proxy-client.crt
	I1202 19:07:24.172012   49088 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/proxy-client.key -> /var/lib/minikube/certs/proxy-client.key
	I1202 19:07:24.172062   49088 certs.go:484] found cert: /home/jenkins/minikube-integration/22021-2487/.minikube/certs/4435.pem (1338 bytes)
	W1202 19:07:24.172113   49088 certs.go:480] ignoring /home/jenkins/minikube-integration/22021-2487/.minikube/certs/4435_empty.pem, impossibly tiny 0 bytes
	I1202 19:07:24.172126   49088 certs.go:484] found cert: /home/jenkins/minikube-integration/22021-2487/.minikube/certs/ca-key.pem (1679 bytes)
	I1202 19:07:24.172154   49088 certs.go:484] found cert: /home/jenkins/minikube-integration/22021-2487/.minikube/certs/ca.pem (1082 bytes)
	I1202 19:07:24.172189   49088 certs.go:484] found cert: /home/jenkins/minikube-integration/22021-2487/.minikube/certs/cert.pem (1123 bytes)
	I1202 19:07:24.172215   49088 certs.go:484] found cert: /home/jenkins/minikube-integration/22021-2487/.minikube/certs/key.pem (1675 bytes)
	I1202 19:07:24.172266   49088 certs.go:484] found cert: /home/jenkins/minikube-integration/22021-2487/.minikube/files/etc/ssl/certs/44352.pem (1708 bytes)
	I1202 19:07:24.172298   49088 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22021-2487/.minikube/files/etc/ssl/certs/44352.pem -> /usr/share/ca-certificates/44352.pem
	I1202 19:07:24.172314   49088 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22021-2487/.minikube/ca.crt -> /usr/share/ca-certificates/minikubeCA.pem
	I1202 19:07:24.172347   49088 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22021-2487/.minikube/certs/4435.pem -> /usr/share/ca-certificates/4435.pem
	I1202 19:07:24.172878   49088 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I1202 19:07:24.192840   49088 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1675 bytes)
	I1202 19:07:24.210709   49088 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I1202 19:07:24.228270   49088 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1675 bytes)
	I1202 19:07:24.246519   49088 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1424 bytes)
	I1202 19:07:24.264649   49088 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1675 bytes)
	I1202 19:07:24.283289   49088 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I1202 19:07:24.302316   49088 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1675 bytes)
	I1202 19:07:24.320907   49088 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/files/etc/ssl/certs/44352.pem --> /usr/share/ca-certificates/44352.pem (1708 bytes)
	I1202 19:07:24.338895   49088 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I1202 19:07:24.356995   49088 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/certs/4435.pem --> /usr/share/ca-certificates/4435.pem (1338 bytes)
	I1202 19:07:24.374784   49088 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (738 bytes)
	I1202 19:07:24.388173   49088 ssh_runner.go:195] Run: openssl version
	I1202 19:07:24.394457   49088 command_runner.go:130] > OpenSSL 3.0.17 1 Jul 2025 (Library: OpenSSL 3.0.17 1 Jul 2025)
	I1202 19:07:24.394840   49088 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/44352.pem && ln -fs /usr/share/ca-certificates/44352.pem /etc/ssl/certs/44352.pem"
	I1202 19:07:24.403512   49088 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/44352.pem
	I1202 19:07:24.407229   49088 command_runner.go:130] > -rw-r--r-- 1 root root 1708 Dec  2 18:58 /usr/share/ca-certificates/44352.pem
	I1202 19:07:24.407385   49088 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Dec  2 18:58 /usr/share/ca-certificates/44352.pem
	I1202 19:07:24.407455   49088 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/44352.pem
	I1202 19:07:24.448501   49088 command_runner.go:130] > 3ec20f2e
	I1202 19:07:24.448942   49088 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/3ec20f2e.0 || ln -fs /etc/ssl/certs/44352.pem /etc/ssl/certs/3ec20f2e.0"
	I1202 19:07:24.456981   49088 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I1202 19:07:24.465478   49088 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I1202 19:07:24.469306   49088 command_runner.go:130] > -rw-r--r-- 1 root root 1111 Dec  2 18:48 /usr/share/ca-certificates/minikubeCA.pem
	I1202 19:07:24.469374   49088 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Dec  2 18:48 /usr/share/ca-certificates/minikubeCA.pem
	I1202 19:07:24.469438   49088 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I1202 19:07:24.510270   49088 command_runner.go:130] > b5213941
	I1202 19:07:24.510784   49088 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
	I1202 19:07:24.518790   49088 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/4435.pem && ln -fs /usr/share/ca-certificates/4435.pem /etc/ssl/certs/4435.pem"
	I1202 19:07:24.527001   49088 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/4435.pem
	I1202 19:07:24.530919   49088 command_runner.go:130] > -rw-r--r-- 1 root root 1338 Dec  2 18:58 /usr/share/ca-certificates/4435.pem
	I1202 19:07:24.530959   49088 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Dec  2 18:58 /usr/share/ca-certificates/4435.pem
	I1202 19:07:24.531008   49088 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/4435.pem
	I1202 19:07:24.571727   49088 command_runner.go:130] > 51391683
	I1202 19:07:24.572161   49088 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/51391683.0 || ln -fs /etc/ssl/certs/4435.pem /etc/ssl/certs/51391683.0"
	I1202 19:07:24.580157   49088 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I1202 19:07:24.584062   49088 command_runner.go:130] >   File: /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I1202 19:07:24.584087   49088 command_runner.go:130] >   Size: 1176      	Blocks: 8          IO Block: 4096   regular file
	I1202 19:07:24.584094   49088 command_runner.go:130] > Device: 259,1	Inode: 848916      Links: 1
	I1202 19:07:24.584101   49088 command_runner.go:130] > Access: (0644/-rw-r--r--)  Uid: (    0/    root)   Gid: (    0/    root)
	I1202 19:07:24.584108   49088 command_runner.go:130] > Access: 2025-12-02 19:03:16.577964732 +0000
	I1202 19:07:24.584114   49088 command_runner.go:130] > Modify: 2025-12-02 18:59:11.965818380 +0000
	I1202 19:07:24.584119   49088 command_runner.go:130] > Change: 2025-12-02 18:59:11.965818380 +0000
	I1202 19:07:24.584125   49088 command_runner.go:130] >  Birth: 2025-12-02 18:59:11.965818380 +0000
	I1202 19:07:24.584207   49088 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-etcd-client.crt -checkend 86400
	I1202 19:07:24.630311   49088 command_runner.go:130] > Certificate will not expire
	I1202 19:07:24.630810   49088 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-kubelet-client.crt -checkend 86400
	I1202 19:07:24.671995   49088 command_runner.go:130] > Certificate will not expire
	I1202 19:07:24.672412   49088 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/server.crt -checkend 86400
	I1202 19:07:24.713648   49088 command_runner.go:130] > Certificate will not expire
	I1202 19:07:24.713758   49088 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/healthcheck-client.crt -checkend 86400
	I1202 19:07:24.754977   49088 command_runner.go:130] > Certificate will not expire
	I1202 19:07:24.755077   49088 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/peer.crt -checkend 86400
	I1202 19:07:24.800995   49088 command_runner.go:130] > Certificate will not expire
	I1202 19:07:24.801486   49088 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/front-proxy-client.crt -checkend 86400
	I1202 19:07:24.844718   49088 command_runner.go:130] > Certificate will not expire
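Each of the six expiry probes above runs `openssl x509 -checkend 86400`, which exits 0 when the certificate is still valid 86400 s (24 h) from now; minikube then logs "Certificate will not expire". A hedged sketch of that exit-status contract in plain sh, with the check emulated on raw epoch seconds so it runs without a real certificate (`check_cert` and the timestamp values are invented for illustration):

```shell
# check_cert emulates `openssl x509 -checkend <window>`: status 0 when the
# certificate's notAfter is still at least <window> seconds in the future.
check_cert() {
  not_after=$1; window=$2; now=$3
  [ $((now + window)) -le "$not_after" ]
}

# Invented epoch values: cert expires at 1000000, "now" is 900000.
if check_cert 1000000 86400 900000; then
  echo "Certificate will not expire"
else
  echo "Certificate will expire"
fi
```

Because only the exit status matters, the caller (here, minikube's cert check) can trigger regeneration purely from the return code, without parsing any output.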
	I1202 19:07:24.845325   49088 kubeadm.go:401] StartCluster: {Name:functional-449836 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-449836 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1202 19:07:24.845410   49088 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:paused Name: Namespaces:[kube-system]}
	I1202 19:07:24.845499   49088 ssh_runner.go:195] Run: sudo -s eval "crictl ps -a --quiet --label io.kubernetes.pod.namespace=kube-system"
	I1202 19:07:24.875465   49088 cri.go:89] found id: ""
	I1202 19:07:24.875565   49088 ssh_runner.go:195] Run: sudo ls /var/lib/kubelet/kubeadm-flags.env /var/lib/kubelet/config.yaml /var/lib/minikube/etcd
	I1202 19:07:24.882887   49088 command_runner.go:130] > /var/lib/kubelet/config.yaml
	I1202 19:07:24.882908   49088 command_runner.go:130] > /var/lib/kubelet/kubeadm-flags.env
	I1202 19:07:24.882928   49088 command_runner.go:130] > /var/lib/minikube/etcd:
	I1202 19:07:24.883961   49088 kubeadm.go:417] found existing configuration files, will attempt cluster restart
	I1202 19:07:24.884012   49088 kubeadm.go:598] restartPrimaryControlPlane start ...
	I1202 19:07:24.884084   49088 ssh_runner.go:195] Run: sudo test -d /data/minikube
	I1202 19:07:24.891632   49088 kubeadm.go:131] /data/minikube skipping compat symlinks: sudo test -d /data/minikube: Process exited with status 1
	stdout:
	
	stderr:
	I1202 19:07:24.892026   49088 kubeconfig.go:47] verify endpoint returned: get endpoint: "functional-449836" does not appear in /home/jenkins/minikube-integration/22021-2487/kubeconfig
	I1202 19:07:24.892129   49088 kubeconfig.go:62] /home/jenkins/minikube-integration/22021-2487/kubeconfig needs updating (will repair): [kubeconfig missing "functional-449836" cluster setting kubeconfig missing "functional-449836" context setting]
	I1202 19:07:24.892546   49088 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/22021-2487/kubeconfig: {Name:mka13b3f16f6b2d645ade32cb83bfcf203300413 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1202 19:07:24.892988   49088 loader.go:402] Config loaded from file:  /home/jenkins/minikube-integration/22021-2487/kubeconfig
	I1202 19:07:24.893140   49088 kapi.go:59] client config for functional-449836: &rest.Config{Host:"https://192.168.49.2:8441", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/client.crt", KeyFile:"/home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/client.key", CAFile:"/home/jenkins/minikube-integration/22021-2487/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x1fb33c0), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), WarningHandlerWithContext:rest.WarningHandlerWithContext(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	I1202 19:07:24.893652   49088 envvar.go:172] "Feature gate default state" feature="WatchListClient" enabled=false
	I1202 19:07:24.893721   49088 cert_rotation.go:141] "Starting client certificate rotation controller" logger="tls-transport-cache"
	I1202 19:07:24.893742   49088 envvar.go:172] "Feature gate default state" feature="ClientsAllowCBOR" enabled=false
	I1202 19:07:24.893817   49088 envvar.go:172] "Feature gate default state" feature="ClientsPreferCBOR" enabled=false
	I1202 19:07:24.893840   49088 envvar.go:172] "Feature gate default state" feature="InformerResourceVersion" enabled=false
	I1202 19:07:24.893879   49088 envvar.go:172] "Feature gate default state" feature="InOrderInformers" enabled=true
	I1202 19:07:24.894204   49088 ssh_runner.go:195] Run: sudo diff -u /var/tmp/minikube/kubeadm.yaml /var/tmp/minikube/kubeadm.yaml.new
	I1202 19:07:24.902267   49088 kubeadm.go:635] The running cluster does not require reconfiguration: 192.168.49.2
	I1202 19:07:24.902298   49088 kubeadm.go:602] duration metric: took 18.265587ms to restartPrimaryControlPlane
	I1202 19:07:24.902309   49088 kubeadm.go:403] duration metric: took 56.993765ms to StartCluster
	I1202 19:07:24.902355   49088 settings.go:142] acquiring lock: {Name:mka76ea0dcf16fdbb68808885f8360c0083029b4 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1202 19:07:24.902437   49088 settings.go:150] Updating kubeconfig:  /home/jenkins/minikube-integration/22021-2487/kubeconfig
	I1202 19:07:24.903036   49088 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/22021-2487/kubeconfig: {Name:mka13b3f16f6b2d645ade32cb83bfcf203300413 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1202 19:07:24.903251   49088 start.go:236] Will wait 6m0s for node &{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}
	I1202 19:07:24.903573   49088 config.go:182] Loaded profile config "functional-449836": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-beta.0
	I1202 19:07:24.903617   49088 addons.go:527] enable addons start: toEnable=map[ambassador:false amd-gpu-device-plugin:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:true efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubetail:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:true storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false]
	I1202 19:07:24.903676   49088 addons.go:70] Setting storage-provisioner=true in profile "functional-449836"
	I1202 19:07:24.903691   49088 addons.go:239] Setting addon storage-provisioner=true in "functional-449836"
	I1202 19:07:24.903717   49088 host.go:66] Checking if "functional-449836" exists ...
	I1202 19:07:24.903830   49088 addons.go:70] Setting default-storageclass=true in profile "functional-449836"
	I1202 19:07:24.903877   49088 addons_storage_classes.go:34] enableOrDisableStorageClasses default-storageclass=true on "functional-449836"
	I1202 19:07:24.904207   49088 cli_runner.go:164] Run: docker container inspect functional-449836 --format={{.State.Status}}
	I1202 19:07:24.904250   49088 cli_runner.go:164] Run: docker container inspect functional-449836 --format={{.State.Status}}
	I1202 19:07:24.909664   49088 out.go:179] * Verifying Kubernetes components...
	I1202 19:07:24.912752   49088 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1202 19:07:24.942660   49088 out.go:179]   - Using image gcr.io/k8s-minikube/storage-provisioner:v5
	I1202 19:07:24.943205   49088 loader.go:402] Config loaded from file:  /home/jenkins/minikube-integration/22021-2487/kubeconfig
	I1202 19:07:24.943381   49088 kapi.go:59] client config for functional-449836: &rest.Config{Host:"https://192.168.49.2:8441", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/client.crt", KeyFile:"/home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/client.key", CAFile:"/home/jenkins/minikube-integration/22021-2487/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x1fb33c0), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), WarningHandlerWithContext:rest.WarningHandlerWithContext(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	I1202 19:07:24.943666   49088 addons.go:239] Setting addon default-storageclass=true in "functional-449836"
	I1202 19:07:24.943695   49088 host.go:66] Checking if "functional-449836" exists ...
	I1202 19:07:24.944105   49088 cli_runner.go:164] Run: docker container inspect functional-449836 --format={{.State.Status}}
	I1202 19:07:24.945588   49088 addons.go:436] installing /etc/kubernetes/addons/storage-provisioner.yaml
	I1202 19:07:24.945617   49088 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/storage-provisioner.yaml (2676 bytes)
	I1202 19:07:24.945676   49088 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-449836
	I1202 19:07:24.976744   49088 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:32788 SSHKeyPath:/home/jenkins/minikube-integration/22021-2487/.minikube/machines/functional-449836/id_rsa Username:docker}
	I1202 19:07:24.983018   49088 addons.go:436] installing /etc/kubernetes/addons/storageclass.yaml
	I1202 19:07:24.983040   49088 ssh_runner.go:362] scp storageclass/storageclass.yaml --> /etc/kubernetes/addons/storageclass.yaml (271 bytes)
	I1202 19:07:24.983109   49088 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-449836
	I1202 19:07:25.013238   49088 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:32788 SSHKeyPath:/home/jenkins/minikube-integration/22021-2487/.minikube/machines/functional-449836/id_rsa Username:docker}
	I1202 19:07:25.139303   49088 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I1202 19:07:25.147308   49088 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1202 19:07:25.166870   49088 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply -f /etc/kubernetes/addons/storageclass.yaml
	I1202 19:07:25.922715   49088 node_ready.go:35] waiting up to 6m0s for node "functional-449836" to be "Ready" ...
	I1202 19:07:25.922842   49088 type.go:168] "Request Body" body=""
	I1202 19:07:25.922904   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:25.923137   49088 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1202 19:07:25.923161   49088 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:25.923181   49088 retry.go:31] will retry after 314.802872ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:25.923212   49088 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1202 19:07:25.923227   49088 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:25.923235   49088 retry.go:31] will retry after 316.161686ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:25.923297   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:26.238968   49088 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1202 19:07:26.239458   49088 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1202 19:07:26.312262   49088 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1202 19:07:26.312301   49088 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:26.312346   49088 retry.go:31] will retry after 358.686092ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:26.320393   49088 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1202 19:07:26.320484   49088 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:26.320525   49088 retry.go:31] will retry after 528.121505ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:26.423804   49088 type.go:168] "Request Body" body=""
	I1202 19:07:26.423895   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:26.424214   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:26.671815   49088 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1202 19:07:26.745439   49088 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1202 19:07:26.745497   49088 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:26.745515   49088 retry.go:31] will retry after 446.477413ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:26.849789   49088 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1202 19:07:26.909069   49088 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1202 19:07:26.909108   49088 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:26.909134   49088 retry.go:31] will retry after 684.877567ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:26.923341   49088 type.go:168] "Request Body" body=""
	I1202 19:07:26.923433   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:26.923791   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:27.192236   49088 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1202 19:07:27.247207   49088 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1202 19:07:27.250502   49088 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:27.250546   49088 retry.go:31] will retry after 797.707708ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:27.423790   49088 type.go:168] "Request Body" body=""
	I1202 19:07:27.423889   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:27.424196   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:27.594774   49088 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1202 19:07:27.660877   49088 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1202 19:07:27.660957   49088 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:27.660987   49088 retry.go:31] will retry after 601.48037ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:27.923401   49088 type.go:168] "Request Body" body=""
	I1202 19:07:27.923475   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:27.923784   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:07:27.923848   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:07:28.049160   49088 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1202 19:07:28.112455   49088 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1202 19:07:28.112493   49088 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:28.112512   49088 retry.go:31] will retry after 941.564206ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:28.262919   49088 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1202 19:07:28.323250   49088 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1202 19:07:28.323307   49088 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:28.323325   49088 retry.go:31] will retry after 741.834409ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:28.423555   49088 type.go:168] "Request Body" body=""
	I1202 19:07:28.423652   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:28.423971   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:28.923658   49088 type.go:168] "Request Body" body=""
	I1202 19:07:28.923731   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:28.923989   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:29.054311   49088 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1202 19:07:29.065740   49088 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1202 19:07:29.126744   49088 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1202 19:07:29.126791   49088 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:29.126812   49088 retry.go:31] will retry after 2.378740888s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:29.143543   49088 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1202 19:07:29.143609   49088 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:29.143631   49088 retry.go:31] will retry after 2.739062704s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:29.422943   49088 type.go:168] "Request Body" body=""
	I1202 19:07:29.423043   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:29.423398   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:29.923116   49088 type.go:168] "Request Body" body=""
	I1202 19:07:29.923203   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:29.923554   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:30.422925   49088 type.go:168] "Request Body" body=""
	I1202 19:07:30.423004   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:30.423294   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:07:30.423351   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:07:30.922969   49088 type.go:168] "Request Body" body=""
	I1202 19:07:30.923047   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:30.923375   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:31.422975   49088 type.go:168] "Request Body" body=""
	I1202 19:07:31.423051   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:31.423376   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:31.506668   49088 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1202 19:07:31.565098   49088 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1202 19:07:31.565149   49088 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:31.565168   49088 retry.go:31] will retry after 3.30231188s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:31.883619   49088 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1202 19:07:31.923118   49088 type.go:168] "Request Body" body=""
	I1202 19:07:31.923190   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:31.923483   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:31.949881   49088 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1202 19:07:31.953682   49088 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:31.953716   49088 retry.go:31] will retry after 2.323480137s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:32.422997   49088 type.go:168] "Request Body" body=""
	I1202 19:07:32.423069   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:32.423382   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:07:32.423441   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:07:32.923115   49088 type.go:168] "Request Body" body=""
	I1202 19:07:32.923193   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:32.923525   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:33.422891   49088 type.go:168] "Request Body" body=""
	I1202 19:07:33.422956   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:33.423209   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:33.922911   49088 type.go:168] "Request Body" body=""
	I1202 19:07:33.922985   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:33.923302   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:34.277557   49088 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1202 19:07:34.337253   49088 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1202 19:07:34.337306   49088 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:34.337326   49088 retry.go:31] will retry after 5.941517157s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:34.423656   49088 type.go:168] "Request Body" body=""
	I1202 19:07:34.423738   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:34.424084   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:07:34.424136   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:07:34.867735   49088 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1202 19:07:34.923406   49088 type.go:168] "Request Body" body=""
	I1202 19:07:34.923506   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:34.923762   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:34.931582   49088 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1202 19:07:34.931622   49088 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:34.931641   49088 retry.go:31] will retry after 5.732328972s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:35.423000   49088 type.go:168] "Request Body" body=""
	I1202 19:07:35.423076   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:35.423385   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:35.922985   49088 type.go:168] "Request Body" body=""
	I1202 19:07:35.923063   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:35.923444   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:36.422994   49088 type.go:168] "Request Body" body=""
	I1202 19:07:36.423069   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:36.423390   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:36.922999   49088 type.go:168] "Request Body" body=""
	I1202 19:07:36.923077   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:36.923397   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:07:36.923453   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:07:37.423120   49088 type.go:168] "Request Body" body=""
	I1202 19:07:37.423196   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:37.423525   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:37.923076   49088 type.go:168] "Request Body" body=""
	I1202 19:07:37.923148   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:37.923450   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:38.423008   49088 type.go:168] "Request Body" body=""
	I1202 19:07:38.423082   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:38.423432   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:38.923000   49088 type.go:168] "Request Body" body=""
	I1202 19:07:38.923074   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:38.923410   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:39.423757   49088 type.go:168] "Request Body" body=""
	I1202 19:07:39.423827   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:39.424076   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:07:39.424115   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:07:39.923939   49088 type.go:168] "Request Body" body=""
	I1202 19:07:39.924013   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:39.924357   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:40.279081   49088 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1202 19:07:40.340610   49088 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1202 19:07:40.340655   49088 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:40.340674   49088 retry.go:31] will retry after 7.832295728s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:40.423881   49088 type.go:168] "Request Body" body=""
	I1202 19:07:40.423959   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:40.424241   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:40.664676   49088 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1202 19:07:40.720825   49088 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1202 19:07:40.724043   49088 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:40.724077   49088 retry.go:31] will retry after 3.410570548s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:40.923400   49088 type.go:168] "Request Body" body=""
	I1202 19:07:40.923497   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:40.923882   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:41.423714   49088 type.go:168] "Request Body" body=""
	I1202 19:07:41.423784   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:41.424115   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:07:41.424172   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:07:41.922915   49088 type.go:168] "Request Body" body=""
	I1202 19:07:41.922990   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:41.923344   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:42.422900   49088 type.go:168] "Request Body" body=""
	I1202 19:07:42.422980   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:42.423254   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:42.922960   49088 type.go:168] "Request Body" body=""
	I1202 19:07:42.923033   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:42.923341   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:43.422982   49088 type.go:168] "Request Body" body=""
	I1202 19:07:43.423067   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:43.423429   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:43.923715   49088 type.go:168] "Request Body" body=""
	I1202 19:07:43.923780   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:43.924087   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:07:43.924145   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:07:44.135480   49088 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1202 19:07:44.194407   49088 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1202 19:07:44.194462   49088 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:44.194482   49088 retry.go:31] will retry after 9.43511002s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:44.423808   49088 type.go:168] "Request Body" body=""
	I1202 19:07:44.423884   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:44.424207   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:44.923173   49088 type.go:168] "Request Body" body=""
	I1202 19:07:44.923287   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:44.923608   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:45.423511   49088 type.go:168] "Request Body" body=""
	I1202 19:07:45.423594   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:45.423852   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:45.923682   49088 type.go:168] "Request Body" body=""
	I1202 19:07:45.923754   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:45.924062   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:46.423867   49088 type.go:168] "Request Body" body=""
	I1202 19:07:46.423945   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:46.424267   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:07:46.424344   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:07:46.923022   49088 type.go:168] "Request Body" body=""
	I1202 19:07:46.923087   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:46.923335   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:47.422976   49088 type.go:168] "Request Body" body=""
	I1202 19:07:47.423049   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:47.423383   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:47.922938   49088 type.go:168] "Request Body" body=""
	I1202 19:07:47.923018   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:47.923343   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:48.173817   49088 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1202 19:07:48.233696   49088 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1202 19:07:48.233741   49088 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:48.233760   49088 retry.go:31] will retry after 11.915058211s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:48.423860   49088 type.go:168] "Request Body" body=""
	I1202 19:07:48.423931   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:48.424192   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:48.922967   49088 type.go:168] "Request Body" body=""
	I1202 19:07:48.923043   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:48.923338   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:07:48.923389   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:07:49.423071   49088 type.go:168] "Request Body" body=""
	I1202 19:07:49.423160   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:49.423457   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:49.923767   49088 type.go:168] "Request Body" body=""
	I1202 19:07:49.923839   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:49.924094   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:50.423628   49088 type.go:168] "Request Body" body=""
	I1202 19:07:50.423697   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:50.424008   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:50.923823   49088 type.go:168] "Request Body" body=""
	I1202 19:07:50.923896   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:50.924199   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:07:50.924253   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:07:51.423846   49088 type.go:168] "Request Body" body=""
	I1202 19:07:51.423915   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:51.424234   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:51.922982   49088 type.go:168] "Request Body" body=""
	I1202 19:07:51.923118   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:51.923450   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:52.423137   49088 type.go:168] "Request Body" body=""
	I1202 19:07:52.423209   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:52.423568   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:52.923553   49088 type.go:168] "Request Body" body=""
	I1202 19:07:52.923629   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:52.923890   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:53.423671   49088 type.go:168] "Request Body" body=""
	I1202 19:07:53.423751   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:53.424089   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:07:53.424151   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:07:53.630602   49088 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1202 19:07:53.701195   49088 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1202 19:07:53.708777   49088 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:53.708825   49088 retry.go:31] will retry after 18.228322251s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:53.923261   49088 type.go:168] "Request Body" body=""
	I1202 19:07:53.923336   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:53.923674   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:54.422909   49088 type.go:168] "Request Body" body=""
	I1202 19:07:54.422976   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:54.423235   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:54.923162   49088 type.go:168] "Request Body" body=""
	I1202 19:07:54.923249   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:54.923575   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:55.422969   49088 type.go:168] "Request Body" body=""
	I1202 19:07:55.423050   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:55.423372   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:55.922912   49088 type.go:168] "Request Body" body=""
	I1202 19:07:55.922984   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:55.923296   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:07:55.923346   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:07:56.422984   49088 type.go:168] "Request Body" body=""
	I1202 19:07:56.423058   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:56.423408   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:56.922983   49088 type.go:168] "Request Body" body=""
	I1202 19:07:56.923062   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:56.923429   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:57.423124   49088 type.go:168] "Request Body" body=""
	I1202 19:07:57.423196   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:57.423456   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:57.922920   49088 type.go:168] "Request Body" body=""
	I1202 19:07:57.922991   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:57.923317   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:07:57.923373   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:07:58.422950   49088 type.go:168] "Request Body" body=""
	I1202 19:07:58.423033   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:58.423366   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:58.923630   49088 type.go:168] "Request Body" body=""
	I1202 19:07:58.923696   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:58.924020   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:59.423807   49088 type.go:168] "Request Body" body=""
	I1202 19:07:59.423887   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:59.424243   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:59.922942   49088 type.go:168] "Request Body" body=""
	I1202 19:07:59.923022   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:59.923353   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:07:59.923410   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:08:00.150075   49088 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1202 19:08:00.323059   49088 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1202 19:08:00.323111   49088 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:08:00.323132   49088 retry.go:31] will retry after 12.256345503s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:08:00.423512   49088 type.go:168] "Request Body" body=""
	I1202 19:08:00.423597   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:00.423977   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:00.923784   49088 type.go:168] "Request Body" body=""
	I1202 19:08:00.923865   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:00.924196   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:01.422909   49088 type.go:168] "Request Body" body=""
	I1202 19:08:01.422990   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:01.423304   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:01.922933   49088 type.go:168] "Request Body" body=""
	I1202 19:08:01.923032   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:01.923287   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:02.422978   49088 type.go:168] "Request Body" body=""
	I1202 19:08:02.423052   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:02.423379   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:08:02.423436   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:08:02.923122   49088 type.go:168] "Request Body" body=""
	I1202 19:08:02.923245   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:02.923555   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:03.423814   49088 type.go:168] "Request Body" body=""
	I1202 19:08:03.423889   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:03.424141   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:03.923895   49088 type.go:168] "Request Body" body=""
	I1202 19:08:03.923996   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:03.924288   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:04.422987   49088 type.go:168] "Request Body" body=""
	I1202 19:08:04.423083   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:04.423375   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:04.922988   49088 type.go:168] "Request Body" body=""
	I1202 19:08:04.923059   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:04.923336   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:08:04.923376   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:08:05.422976   49088 type.go:168] "Request Body" body=""
	I1202 19:08:05.423047   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:05.423391   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:05.922959   49088 type.go:168] "Request Body" body=""
	I1202 19:08:05.923031   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:05.923359   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:06.423783   49088 type.go:168] "Request Body" body=""
	I1202 19:08:06.423854   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:06.424112   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:06.923877   49088 type.go:168] "Request Body" body=""
	I1202 19:08:06.923974   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:06.924303   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:08:06.924381   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:08:07.423044   49088 type.go:168] "Request Body" body=""
	I1202 19:08:07.423125   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:07.423474   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:07.922856   49088 type.go:168] "Request Body" body=""
	I1202 19:08:07.922930   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:07.923205   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:08.422913   49088 type.go:168] "Request Body" body=""
	I1202 19:08:08.422992   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:08.423315   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:08.922886   49088 type.go:168] "Request Body" body=""
	I1202 19:08:08.922995   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:08.923313   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:09.422913   49088 type.go:168] "Request Body" body=""
	I1202 19:08:09.423006   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:09.423295   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:08:09.423343   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:08:09.922894   49088 type.go:168] "Request Body" body=""
	I1202 19:08:09.922995   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:09.923296   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:10.423073   49088 type.go:168] "Request Body" body=""
	I1202 19:08:10.423153   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:10.423491   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:10.923741   49088 type.go:168] "Request Body" body=""
	I1202 19:08:10.923814   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:10.924073   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:11.423834   49088 type.go:168] "Request Body" body=""
	I1202 19:08:11.423907   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:11.424248   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:08:11.424304   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:08:11.922960   49088 type.go:168] "Request Body" body=""
	I1202 19:08:11.923033   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:11.923342   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:11.937687   49088 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1202 19:08:11.996748   49088 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1202 19:08:11.999800   49088 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:08:11.999828   49088 retry.go:31] will retry after 12.016513449s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:08:12.423502   49088 type.go:168] "Request Body" body=""
	I1202 19:08:12.423582   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:12.423831   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:12.580354   49088 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1202 19:08:12.637408   49088 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1202 19:08:12.637456   49088 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:08:12.637477   49088 retry.go:31] will retry after 30.215930355s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:08:12.923948   49088 type.go:168] "Request Body" body=""
	I1202 19:08:12.924043   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:12.924384   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:13.422995   49088 type.go:168] "Request Body" body=""
	I1202 19:08:13.423082   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:13.423402   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:13.923854   49088 type.go:168] "Request Body" body=""
	I1202 19:08:13.923924   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:13.924172   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:08:13.924221   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:08:14.422931   49088 type.go:168] "Request Body" body=""
	I1202 19:08:14.423013   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:14.423360   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:14.923106   49088 type.go:168] "Request Body" body=""
	I1202 19:08:14.923201   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:14.923504   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:15.423455   49088 type.go:168] "Request Body" body=""
	I1202 19:08:15.423543   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:15.423801   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:15.923582   49088 type.go:168] "Request Body" body=""
	I1202 19:08:15.923658   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:15.923982   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:16.423696   49088 type.go:168] "Request Body" body=""
	I1202 19:08:16.423768   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:16.424069   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:08:16.424123   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:08:16.923441   49088 type.go:168] "Request Body" body=""
	I1202 19:08:16.923513   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:16.923823   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:17.423623   49088 type.go:168] "Request Body" body=""
	I1202 19:08:17.423715   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:17.424059   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:17.923916   49088 type.go:168] "Request Body" body=""
	I1202 19:08:17.923987   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:17.924303   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:18.422927   49088 type.go:168] "Request Body" body=""
	I1202 19:08:18.423013   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:18.423293   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:18.922978   49088 type.go:168] "Request Body" body=""
	I1202 19:08:18.923057   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:18.923439   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:08:18.923511   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:08:19.423204   49088 type.go:168] "Request Body" body=""
	I1202 19:08:19.423280   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:19.423633   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:19.923322   49088 type.go:168] "Request Body" body=""
	I1202 19:08:19.923392   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:19.923647   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:20.423776   49088 type.go:168] "Request Body" body=""
	I1202 19:08:20.423870   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:20.424201   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:20.922961   49088 type.go:168] "Request Body" body=""
	I1202 19:08:20.923040   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:20.923370   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:21.423693   49088 type.go:168] "Request Body" body=""
	I1202 19:08:21.423801   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:21.424068   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:08:21.424117   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:08:21.923863   49088 type.go:168] "Request Body" body=""
	I1202 19:08:21.923935   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:21.924262   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:22.422993   49088 type.go:168] "Request Body" body=""
	I1202 19:08:22.423066   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:22.423391   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:22.922927   49088 type.go:168] "Request Body" body=""
	I1202 19:08:22.922998   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:22.923323   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:23.422977   49088 type.go:168] "Request Body" body=""
	I1202 19:08:23.423052   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:23.423364   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:23.922971   49088 type.go:168] "Request Body" body=""
	I1202 19:08:23.923047   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:23.923343   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:08:23.923391   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:08:24.016567   49088 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1202 19:08:24.078750   49088 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1202 19:08:24.078790   49088 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:08:24.078809   49088 retry.go:31] will retry after 37.473532818s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:08:24.423149   49088 type.go:168] "Request Body" body=""
	I1202 19:08:24.423225   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:24.423606   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:24.923585   49088 type.go:168] "Request Body" body=""
	I1202 19:08:24.923686   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:24.924015   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:25.423855   49088 type.go:168] "Request Body" body=""
	I1202 19:08:25.423933   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:25.424248   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:25.923542   49088 type.go:168] "Request Body" body=""
	I1202 19:08:25.923615   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:25.923871   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:08:25.923923   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:08:26.423702   49088 type.go:168] "Request Body" body=""
	I1202 19:08:26.423799   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:26.424100   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:26.923908   49088 type.go:168] "Request Body" body=""
	I1202 19:08:26.923990   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:26.924357   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:27.422916   49088 type.go:168] "Request Body" body=""
	I1202 19:08:27.423000   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:27.423295   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:27.922995   49088 type.go:168] "Request Body" body=""
	I1202 19:08:27.923085   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:27.923386   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:28.423073   49088 type.go:168] "Request Body" body=""
	I1202 19:08:28.423198   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:28.423550   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:08:28.423605   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:08:28.923207   49088 type.go:168] "Request Body" body=""
	I1202 19:08:28.923280   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:28.923547   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:29.423229   49088 type.go:168] "Request Body" body=""
	I1202 19:08:29.423310   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:29.423621   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:29.922966   49088 type.go:168] "Request Body" body=""
	I1202 19:08:29.923038   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:29.923334   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:30.422914   49088 type.go:168] "Request Body" body=""
	I1202 19:08:30.422986   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:30.423290   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:30.922992   49088 type.go:168] "Request Body" body=""
	I1202 19:08:30.923070   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:30.923374   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:08:30.923423   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:08:31.422962   49088 type.go:168] "Request Body" body=""
	I1202 19:08:31.423044   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:31.423370   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:31.923658   49088 type.go:168] "Request Body" body=""
	I1202 19:08:31.923727   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:31.923984   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:32.423783   49088 type.go:168] "Request Body" body=""
	I1202 19:08:32.423853   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:32.424194   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:32.923864   49088 type.go:168] "Request Body" body=""
	I1202 19:08:32.923952   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:32.924274   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:08:32.924361   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:08:33.422870   49088 type.go:168] "Request Body" body=""
	I1202 19:08:33.422935   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:33.423233   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:33.922930   49088 type.go:168] "Request Body" body=""
	I1202 19:08:33.923021   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:33.923340   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:34.423053   49088 type.go:168] "Request Body" body=""
	I1202 19:08:34.423137   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:34.423440   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:34.923286   49088 type.go:168] "Request Body" body=""
	I1202 19:08:34.923353   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:34.923610   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:35.423738   49088 type.go:168] "Request Body" body=""
	I1202 19:08:35.423811   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:35.424122   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:08:35.424178   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:08:35.923982   49088 type.go:168] "Request Body" body=""
	I1202 19:08:35.924054   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:35.924397   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:36.423490   49088 type.go:168] "Request Body" body=""
	I1202 19:08:36.423577   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:36.423904   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:36.923609   49088 type.go:168] "Request Body" body=""
	I1202 19:08:36.923698   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:36.924003   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:37.423836   49088 type.go:168] "Request Body" body=""
	I1202 19:08:37.423908   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:37.424273   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:08:37.424347   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:08:37.923878   49088 type.go:168] "Request Body" body=""
	I1202 19:08:37.923949   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:37.924222   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:38.422922   49088 type.go:168] "Request Body" body=""
	I1202 19:08:38.422995   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:38.423329   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:38.923060   49088 type.go:168] "Request Body" body=""
	I1202 19:08:38.923133   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:38.923451   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:39.422914   49088 type.go:168] "Request Body" body=""
	I1202 19:08:39.422990   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:39.423240   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:39.922980   49088 type.go:168] "Request Body" body=""
	I1202 19:08:39.923057   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:39.923354   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:08:39.923403   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:08:40.423331   49088 type.go:168] "Request Body" body=""
	I1202 19:08:40.423423   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:40.423754   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:40.923541   49088 type.go:168] "Request Body" body=""
	I1202 19:08:40.923652   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:40.923952   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:41.423758   49088 type.go:168] "Request Body" body=""
	I1202 19:08:41.423829   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:41.424159   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:41.922904   49088 type.go:168] "Request Body" body=""
	I1202 19:08:41.922987   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:41.923363   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:42.423568   49088 type.go:168] "Request Body" body=""
	I1202 19:08:42.423637   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:42.423879   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:08:42.423921   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:08:42.854609   49088 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1202 19:08:42.913285   49088 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1202 19:08:42.916268   49088 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:08:42.916300   49088 retry.go:31] will retry after 24.794449401s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:08:42.923470   49088 type.go:168] "Request Body" body=""
	I1202 19:08:42.923553   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:42.923860   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:43.423622   49088 type.go:168] "Request Body" body=""
	I1202 19:08:43.423694   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:43.423983   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:43.923751   49088 type.go:168] "Request Body" body=""
	I1202 19:08:43.923834   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:43.924123   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:44.422913   49088 type.go:168] "Request Body" body=""
	I1202 19:08:44.423014   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:44.423327   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:44.923006   49088 type.go:168] "Request Body" body=""
	I1202 19:08:44.923080   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:44.923422   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:08:44.923476   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:08:45.422929   49088 type.go:168] "Request Body" body=""
	I1202 19:08:45.423000   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:45.423274   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:45.922967   49088 type.go:168] "Request Body" body=""
	I1202 19:08:45.923040   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:45.923364   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:46.422980   49088 type.go:168] "Request Body" body=""
	I1202 19:08:46.423051   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:46.423401   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:46.922941   49088 type.go:168] "Request Body" body=""
	I1202 19:08:46.923013   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:46.923277   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:47.422968   49088 type.go:168] "Request Body" body=""
	I1202 19:08:47.423062   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:47.423388   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:08:47.423440   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:08:47.922972   49088 type.go:168] "Request Body" body=""
	I1202 19:08:47.923042   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:47.923383   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:48.423521   49088 type.go:168] "Request Body" body=""
	I1202 19:08:48.423697   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:48.424010   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:48.923782   49088 type.go:168] "Request Body" body=""
	I1202 19:08:48.923856   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:48.924186   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:49.422919   49088 type.go:168] "Request Body" body=""
	I1202 19:08:49.422992   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:49.423372   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:49.922931   49088 type.go:168] "Request Body" body=""
	I1202 19:08:49.923004   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:49.923306   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:08:49.923362   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:08:50.423001   49088 type.go:168] "Request Body" body=""
	I1202 19:08:50.423076   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:50.423400   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:50.922993   49088 type.go:168] "Request Body" body=""
	I1202 19:08:50.923072   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:50.923422   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:51.423082   49088 type.go:168] "Request Body" body=""
	I1202 19:08:51.423174   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:51.423497   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:51.922985   49088 type.go:168] "Request Body" body=""
	I1202 19:08:51.923056   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:51.923335   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:08:51.923382   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:08:52.423062   49088 type.go:168] "Request Body" body=""
	I1202 19:08:52.423141   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:52.423469   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:52.922906   49088 type.go:168] "Request Body" body=""
	I1202 19:08:52.922982   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:52.923304   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:53.422987   49088 type.go:168] "Request Body" body=""
	I1202 19:08:53.423074   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:53.423405   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:53.923177   49088 type.go:168] "Request Body" body=""
	I1202 19:08:53.923253   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:53.923577   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:08:53.923645   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:08:54.422880   49088 type.go:168] "Request Body" body=""
	I1202 19:08:54.422958   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:54.423220   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:54.923069   49088 type.go:168] "Request Body" body=""
	I1202 19:08:54.923142   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:54.923466   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:55.423373   49088 type.go:168] "Request Body" body=""
	I1202 19:08:55.423459   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:55.423806   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:55.923603   49088 type.go:168] "Request Body" body=""
	I1202 19:08:55.923681   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:55.923944   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:08:55.923992   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:08:56.423750   49088 type.go:168] "Request Body" body=""
	I1202 19:08:56.423821   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:56.424196   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:56.922939   49088 type.go:168] "Request Body" body=""
	I1202 19:08:56.923015   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:56.923399   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:57.423718   49088 type.go:168] "Request Body" body=""
	I1202 19:08:57.423789   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:57.424085   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:57.923903   49088 type.go:168] "Request Body" body=""
	I1202 19:08:57.923980   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:57.924302   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:08:57.924374   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:08:58.422958   49088 type.go:168] "Request Body" body=""
	I1202 19:08:58.423030   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:58.423367   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:58.923777   49088 type.go:168] "Request Body" body=""
	I1202 19:08:58.923851   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:58.924127   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:59.423881   49088 type.go:168] "Request Body" body=""
	I1202 19:08:59.423956   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:59.424305   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:59.922904   49088 type.go:168] "Request Body" body=""
	I1202 19:08:59.922978   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:59.923298   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:00.423245   49088 type.go:168] "Request Body" body=""
	I1202 19:09:00.423318   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:00.423619   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:09:00.423665   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:09:00.922984   49088 type.go:168] "Request Body" body=""
	I1202 19:09:00.923063   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:00.923384   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:01.422976   49088 type.go:168] "Request Body" body=""
	I1202 19:09:01.423055   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:01.423390   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:01.552630   49088 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1202 19:09:01.616821   49088 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1202 19:09:01.616872   49088 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1202 19:09:01.616967   49088 out.go:285] ! Enabling 'storage-provisioner' returned an error: running callbacks: [sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	]
	I1202 19:09:01.923268   49088 type.go:168] "Request Body" body=""
	I1202 19:09:01.923333   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:01.923595   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:02.423036   49088 type.go:168] "Request Body" body=""
	I1202 19:09:02.423106   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:02.423425   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:02.922957   49088 type.go:168] "Request Body" body=""
	I1202 19:09:02.923030   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:02.923349   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:09:02.923411   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:09:03.422870   49088 type.go:168] "Request Body" body=""
	I1202 19:09:03.422937   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:03.423202   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:03.922969   49088 type.go:168] "Request Body" body=""
	I1202 19:09:03.923048   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:03.923382   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:04.422966   49088 type.go:168] "Request Body" body=""
	I1202 19:09:04.423045   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:04.423360   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:04.923022   49088 type.go:168] "Request Body" body=""
	I1202 19:09:04.923097   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:04.923357   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:05.423319   49088 type.go:168] "Request Body" body=""
	I1202 19:09:05.423392   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:05.423740   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:09:05.423793   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:09:05.923326   49088 type.go:168] "Request Body" body=""
	I1202 19:09:05.923409   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:05.923718   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:06.423454   49088 type.go:168] "Request Body" body=""
	I1202 19:09:06.423525   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:06.423826   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:06.923640   49088 type.go:168] "Request Body" body=""
	I1202 19:09:06.923716   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:06.924092   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:07.423755   49088 type.go:168] "Request Body" body=""
	I1202 19:09:07.423833   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:07.424174   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:09:07.424240   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:09:07.711667   49088 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1202 19:09:07.768083   49088 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1202 19:09:07.771273   49088 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1202 19:09:07.771371   49088 out.go:285] ! Enabling 'default-storageclass' returned an error: running callbacks: [sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	]
	I1202 19:09:07.774489   49088 out.go:179] * Enabled addons: 
	I1202 19:09:07.778178   49088 addons.go:530] duration metric: took 1m42.874553995s for enable addons: enabled=[]
	I1202 19:09:07.923592   49088 type.go:168] "Request Body" body=""
	I1202 19:09:07.923663   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:07.923975   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:08.423753   49088 type.go:168] "Request Body" body=""
	I1202 19:09:08.423867   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:08.424222   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:08.922931   49088 type.go:168] "Request Body" body=""
	I1202 19:09:08.923003   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:08.923358   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:09.423790   49088 type.go:168] "Request Body" body=""
	I1202 19:09:09.423880   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:09.424153   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:09.922907   49088 type.go:168] "Request Body" body=""
	I1202 19:09:09.923001   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:09.923324   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:09:09.923374   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:09:10.423145   49088 type.go:168] "Request Body" body=""
	I1202 19:09:10.423260   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:10.423579   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:10.922932   49088 type.go:168] "Request Body" body=""
	I1202 19:09:10.923028   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:10.923400   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:11.422981   49088 type.go:168] "Request Body" body=""
	I1202 19:09:11.423062   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:11.423397   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:11.923082   49088 type.go:168] "Request Body" body=""
	I1202 19:09:11.923154   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:11.923464   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:09:11.923521   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:09:12.422896   49088 type.go:168] "Request Body" body=""
	I1202 19:09:12.422975   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:12.423250   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:12.922964   49088 type.go:168] "Request Body" body=""
	I1202 19:09:12.923043   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:12.923387   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:13.423093   49088 type.go:168] "Request Body" body=""
	I1202 19:09:13.423175   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:13.423500   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:13.923207   49088 type.go:168] "Request Body" body=""
	I1202 19:09:13.923272   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:13.923535   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:09:13.923574   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:09:14.422969   49088 type.go:168] "Request Body" body=""
	I1202 19:09:14.423065   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:14.423377   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:14.923293   49088 type.go:168] "Request Body" body=""
	I1202 19:09:14.923367   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:14.923688   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:15.423514   49088 type.go:168] "Request Body" body=""
	I1202 19:09:15.423584   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:15.423882   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:15.923633   49088 type.go:168] "Request Body" body=""
	I1202 19:09:15.923702   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:15.924013   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:09:15.924083   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:09:16.423892   49088 type.go:168] "Request Body" body=""
	I1202 19:09:16.423994   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:16.424346   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:16.922929   49088 type.go:168] "Request Body" body=""
	I1202 19:09:16.922996   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:16.923246   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:17.422959   49088 type.go:168] "Request Body" body=""
	I1202 19:09:17.423030   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:17.423344   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:17.923012   49088 type.go:168] "Request Body" body=""
	I1202 19:09:17.923112   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:17.923445   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:18.423579   49088 type.go:168] "Request Body" body=""
	I1202 19:09:18.423646   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:18.423912   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:09:18.423954   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:09:18.923689   49088 type.go:168] "Request Body" body=""
	I1202 19:09:18.923816   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:18.924164   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:19.423865   49088 type.go:168] "Request Body" body=""
	I1202 19:09:19.423938   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:19.424264   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:19.922971   49088 type.go:168] "Request Body" body=""
	I1202 19:09:19.923065   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:19.928773   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=5
	I1202 19:09:20.423719   49088 type.go:168] "Request Body" body=""
	I1202 19:09:20.423798   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:20.424108   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:09:20.424158   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:09:20.923797   49088 type.go:168] "Request Body" body=""
	I1202 19:09:20.923876   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:20.924234   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:21.423890   49088 type.go:168] "Request Body" body=""
	I1202 19:09:21.423990   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:21.424292   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:21.922965   49088 type.go:168] "Request Body" body=""
	I1202 19:09:21.923044   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:21.923382   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:22.422969   49088 type.go:168] "Request Body" body=""
	I1202 19:09:22.423043   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:22.423383   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:22.923067   49088 type.go:168] "Request Body" body=""
	I1202 19:09:22.923150   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:22.923449   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:09:22.923501   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:09:23.423230   49088 type.go:168] "Request Body" body=""
	I1202 19:09:23.423312   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:23.423745   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:23.922960   49088 type.go:168] "Request Body" body=""
	I1202 19:09:23.923037   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:23.923364   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:24.423610   49088 type.go:168] "Request Body" body=""
	I1202 19:09:24.423697   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:24.423973   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:24.922875   49088 type.go:168] "Request Body" body=""
	I1202 19:09:24.922950   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:24.923296   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:25.422971   49088 type.go:168] "Request Body" body=""
	I1202 19:09:25.423045   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:25.423397   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:09:25.423453   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:09:25.923781   49088 type.go:168] "Request Body" body=""
	I1202 19:09:25.923849   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:25.924111   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:26.423856   49088 type.go:168] "Request Body" body=""
	I1202 19:09:26.423928   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:26.424242   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:26.922969   49088 type.go:168] "Request Body" body=""
	I1202 19:09:26.923039   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:26.923392   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:27.423069   49088 type.go:168] "Request Body" body=""
	I1202 19:09:27.423144   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:27.423407   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:27.922946   49088 type.go:168] "Request Body" body=""
	I1202 19:09:27.923018   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:27.923358   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:09:27.923411   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:09:28.423076   49088 type.go:168] "Request Body" body=""
	I1202 19:09:28.423152   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:28.423495   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:28.923840   49088 type.go:168] "Request Body" body=""
	I1202 19:09:28.923913   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:28.924219   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:29.422911   49088 type.go:168] "Request Body" body=""
	I1202 19:09:29.422989   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:29.423326   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:29.923042   49088 type.go:168] "Request Body" body=""
	I1202 19:09:29.923119   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:29.923480   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:09:29.923538   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:09:30.422961   49088 type.go:168] "Request Body" body=""
	I1202 19:09:30.423047   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:30.423371   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:30.922960   49088 type.go:168] "Request Body" body=""
	I1202 19:09:30.923040   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:30.923370   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:31.423083   49088 type.go:168] "Request Body" body=""
	I1202 19:09:31.423155   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:31.423507   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:31.922881   49088 type.go:168] "Request Body" body=""
	I1202 19:09:31.922954   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:31.923312   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:32.422968   49088 type.go:168] "Request Body" body=""
	I1202 19:09:32.423060   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:32.423400   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:09:32.423472   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:09:32.922956   49088 type.go:168] "Request Body" body=""
	I1202 19:09:32.923050   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:32.923391   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:33.423100   49088 type.go:168] "Request Body" body=""
	I1202 19:09:33.423172   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:33.423473   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:33.922963   49088 type.go:168] "Request Body" body=""
	I1202 19:09:33.923039   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:33.923379   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:34.423096   49088 type.go:168] "Request Body" body=""
	I1202 19:09:34.423177   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:34.423484   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:09:34.423532   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:09:34.923381   49088 type.go:168] "Request Body" body=""
	I1202 19:09:34.923452   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:34.923763   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:35.423619   49088 type.go:168] "Request Body" body=""
	I1202 19:09:35.423698   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:35.424059   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:35.923863   49088 type.go:168] "Request Body" body=""
	I1202 19:09:35.923933   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:35.924297   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:36.423817   49088 type.go:168] "Request Body" body=""
	I1202 19:09:36.423883   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:36.424153   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:09:36.424193   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:09:36.922887   49088 type.go:168] "Request Body" body=""
	I1202 19:09:36.922964   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:36.923304   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:37.422984   49088 type.go:168] "Request Body" body=""
	I1202 19:09:37.423062   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:37.423400   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:37.922914   49088 type.go:168] "Request Body" body=""
	I1202 19:09:37.922987   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:37.923253   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:38.422943   49088 type.go:168] "Request Body" body=""
	I1202 19:09:38.423020   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:38.423341   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:38.922960   49088 type.go:168] "Request Body" body=""
	I1202 19:09:38.923084   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:38.923406   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:09:38.923459   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:09:39.423118   49088 type.go:168] "Request Body" body=""
	I1202 19:09:39.423188   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:39.423443   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:39.922963   49088 type.go:168] "Request Body" body=""
	I1202 19:09:39.923038   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:39.923390   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:40.422957   49088 type.go:168] "Request Body" body=""
	I1202 19:09:40.423031   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:40.423438   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:40.922909   49088 type.go:168] "Request Body" body=""
	I1202 19:09:40.922985   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:40.923328   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:41.422954   49088 type.go:168] "Request Body" body=""
	I1202 19:09:41.423029   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:41.423387   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:09:41.423450   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:09:41.923108   49088 type.go:168] "Request Body" body=""
	I1202 19:09:41.923187   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:41.923536   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:42.423214   49088 type.go:168] "Request Body" body=""
	I1202 19:09:42.423293   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:42.423567   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:42.922993   49088 type.go:168] "Request Body" body=""
	I1202 19:09:42.923073   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:42.923422   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:43.422977   49088 type.go:168] "Request Body" body=""
	I1202 19:09:43.423055   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:43.423398   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:43.923097   49088 type.go:168] "Request Body" body=""
	I1202 19:09:43.923183   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:43.923554   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:09:43.923610   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:09:44.422960   49088 type.go:168] "Request Body" body=""
	I1202 19:09:44.423031   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:44.423372   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:44.923133   49088 type.go:168] "Request Body" body=""
	I1202 19:09:44.923217   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:44.923568   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:45.422996   49088 type.go:168] "Request Body" body=""
	I1202 19:09:45.423065   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:45.423325   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:45.922966   49088 type.go:168] "Request Body" body=""
	I1202 19:09:45.923043   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:45.923392   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:46.422965   49088 type.go:168] "Request Body" body=""
	I1202 19:09:46.423041   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:46.423338   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:09:46.423383   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:09:46.923662   49088 type.go:168] "Request Body" body=""
	I1202 19:09:46.923729   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:46.923996   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:47.423794   49088 type.go:168] "Request Body" body=""
	I1202 19:09:47.423868   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:47.424199   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:47.922887   49088 type.go:168] "Request Body" body=""
	I1202 19:09:47.922970   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:47.923290   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:48.423862   49088 type.go:168] "Request Body" body=""
	I1202 19:09:48.423930   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:48.424197   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:09:48.424242   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:09:48.922882   49088 type.go:168] "Request Body" body=""
	I1202 19:09:48.922964   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:48.923305   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:49.422874   49088 type.go:168] "Request Body" body=""
	I1202 19:09:49.422952   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:49.423501   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:49.923227   49088 type.go:168] "Request Body" body=""
	I1202 19:09:49.923298   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:49.923571   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:50.423530   49088 type.go:168] "Request Body" body=""
	I1202 19:09:50.423605   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:50.423930   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:50.923709   49088 type.go:168] "Request Body" body=""
	I1202 19:09:50.923791   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:50.924129   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:09:50.924183   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:09:51.423574   49088 type.go:168] "Request Body" body=""
	I1202 19:09:51.423645   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:51.423989   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:51.923773   49088 type.go:168] "Request Body" body=""
	I1202 19:09:51.923846   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:51.924175   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:52.423792   49088 type.go:168] "Request Body" body=""
	I1202 19:09:52.423865   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:52.424194   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:52.923791   49088 type.go:168] "Request Body" body=""
	I1202 19:09:52.923863   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:52.924133   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:53.423850   49088 type.go:168] "Request Body" body=""
	I1202 19:09:53.423929   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:53.424252   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:09:53.424366   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:09:53.922972   49088 type.go:168] "Request Body" body=""
	I1202 19:09:53.923047   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:53.923376   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:54.422922   49088 type.go:168] "Request Body" body=""
	I1202 19:09:54.422999   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:54.423364   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:54.923085   49088 type.go:168] "Request Body" body=""
	I1202 19:09:54.923175   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:54.923548   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:55.422950   49088 type.go:168] "Request Body" body=""
	I1202 19:09:55.423041   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:55.423400   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:55.923676   49088 type.go:168] "Request Body" body=""
	I1202 19:09:55.923766   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:55.924024   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:09:55.924066   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:09:56.423823   49088 type.go:168] "Request Body" body=""
	I1202 19:09:56.423900   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:56.424217   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:56.922947   49088 type.go:168] "Request Body" body=""
	I1202 19:09:56.923022   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:56.923366   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:57.422911   49088 type.go:168] "Request Body" body=""
	I1202 19:09:57.422984   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:57.423240   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:57.922952   49088 type.go:168] "Request Body" body=""
	I1202 19:09:57.923033   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:57.923361   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:58.422959   49088 type.go:168] "Request Body" body=""
	I1202 19:09:58.423033   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:58.423382   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:09:58.423443   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:09:58.923738   49088 type.go:168] "Request Body" body=""
	I1202 19:09:58.923812   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:58.924072   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:59.423859   49088 type.go:168] "Request Body" body=""
	I1202 19:09:59.423937   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:59.424270   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:59.922977   49088 type.go:168] "Request Body" body=""
	I1202 19:09:59.923052   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:59.923398   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:00.435478   49088 type.go:168] "Request Body" body=""
	I1202 19:10:00.435562   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:00.435862   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:10:00.435913   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:10:00.923635   49088 type.go:168] "Request Body" body=""
	I1202 19:10:00.923715   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:00.924056   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:01.423705   49088 type.go:168] "Request Body" body=""
	I1202 19:10:01.423779   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:01.424081   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:01.923812   49088 type.go:168] "Request Body" body=""
	I1202 19:10:01.923878   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:01.924156   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:02.423889   49088 type.go:168] "Request Body" body=""
	I1202 19:10:02.423969   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:02.424269   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:02.922968   49088 type.go:168] "Request Body" body=""
	I1202 19:10:02.923056   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:02.923443   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:10:02.923500   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:10:03.423151   49088 type.go:168] "Request Body" body=""
	I1202 19:10:03.423226   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:03.423486   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:03.922960   49088 type.go:168] "Request Body" body=""
	I1202 19:10:03.923040   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:03.923371   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:04.422986   49088 type.go:168] "Request Body" body=""
	I1202 19:10:04.423059   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:04.423412   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:04.923351   49088 type.go:168] "Request Body" body=""
	I1202 19:10:04.923425   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:04.923691   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:10:04.923736   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:10:05.423841   49088 type.go:168] "Request Body" body=""
	I1202 19:10:05.423932   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:05.424309   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:05.922992   49088 type.go:168] "Request Body" body=""
	I1202 19:10:05.923078   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:05.923444   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:06.423123   49088 type.go:168] "Request Body" body=""
	I1202 19:10:06.423232   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:06.423495   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:06.922977   49088 type.go:168] "Request Body" body=""
	I1202 19:10:06.923050   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:06.923408   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:07.422988   49088 type.go:168] "Request Body" body=""
	I1202 19:10:07.423069   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:07.423489   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:10:07.423547   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:10:07.923028   49088 type.go:168] "Request Body" body=""
	I1202 19:10:07.923107   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:07.923442   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:08.422975   49088 type.go:168] "Request Body" body=""
	I1202 19:10:08.423051   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:08.423408   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:08.922956   49088 type.go:168] "Request Body" body=""
	I1202 19:10:08.923038   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:08.923365   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:09.423008   49088 type.go:168] "Request Body" body=""
	I1202 19:10:09.423094   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:09.423439   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:09.922983   49088 type.go:168] "Request Body" body=""
	I1202 19:10:09.923062   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:09.923410   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:10:09.923465   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:10:10.422981   49088 type.go:168] "Request Body" body=""
	I1202 19:10:10.423060   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:10.423387   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:10.922955   49088 type.go:168] "Request Body" body=""
	I1202 19:10:10.923025   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:10.923302   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:11.422980   49088 type.go:168] "Request Body" body=""
	I1202 19:10:11.423052   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:11.423400   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:11.922952   49088 type.go:168] "Request Body" body=""
	I1202 19:10:11.923032   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:11.923379   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:12.422929   49088 type.go:168] "Request Body" body=""
	I1202 19:10:12.423003   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:12.423285   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:10:12.423327   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:10:12.922978   49088 type.go:168] "Request Body" body=""
	I1202 19:10:12.923051   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:12.923388   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:13.422975   49088 type.go:168] "Request Body" body=""
	I1202 19:10:13.423050   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:13.423417   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:13.922921   49088 type.go:168] "Request Body" body=""
	I1202 19:10:13.922997   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:13.923304   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:14.422948   49088 type.go:168] "Request Body" body=""
	I1202 19:10:14.423026   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:14.423361   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:10:14.423418   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:10:14.923139   49088 type.go:168] "Request Body" body=""
	I1202 19:10:14.923221   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:14.923551   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:15.423337   49088 type.go:168] "Request Body" body=""
	I1202 19:10:15.423420   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:15.423733   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:15.922959   49088 type.go:168] "Request Body" body=""
	I1202 19:10:15.923040   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:15.923380   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:16.422956   49088 type.go:168] "Request Body" body=""
	I1202 19:10:16.423036   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:16.423383   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:16.923752   49088 type.go:168] "Request Body" body=""
	I1202 19:10:16.923818   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:16.924073   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:10:16.924112   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:10:17.423826   49088 type.go:168] "Request Body" body=""
	I1202 19:10:17.423915   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:17.424256   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:17.922988   49088 type.go:168] "Request Body" body=""
	I1202 19:10:17.923068   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:17.923403   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:18.422876   49088 type.go:168] "Request Body" body=""
	I1202 19:10:18.422953   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:18.423220   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:18.922913   49088 type.go:168] "Request Body" body=""
	I1202 19:10:18.922988   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:18.923326   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:19.423037   49088 type.go:168] "Request Body" body=""
	I1202 19:10:19.423111   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:19.423450   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:10:19.423505   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:10:19.923780   49088 type.go:168] "Request Body" body=""
	I1202 19:10:19.923847   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:19.924112   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:20.423105   49088 type.go:168] "Request Body" body=""
	I1202 19:10:20.423212   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:20.423516   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:20.922965   49088 type.go:168] "Request Body" body=""
	I1202 19:10:20.923060   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:20.923378   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:21.423065   49088 type.go:168] "Request Body" body=""
	I1202 19:10:21.423140   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:21.423414   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:21.923020   49088 type.go:168] "Request Body" body=""
	I1202 19:10:21.923093   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:21.923371   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:10:21.923415   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:10:22.423093   49088 type.go:168] "Request Body" body=""
	I1202 19:10:22.423166   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:22.423511   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:22.923085   49088 type.go:168] "Request Body" body=""
	I1202 19:10:22.923154   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:22.923434   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:23.423060   49088 type.go:168] "Request Body" body=""
	I1202 19:10:23.423133   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:23.423446   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:23.923195   49088 type.go:168] "Request Body" body=""
	I1202 19:10:23.923269   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:23.923577   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:10:23.923622   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:10:24.422877   49088 type.go:168] "Request Body" body=""
	I1202 19:10:24.422957   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:24.423230   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:24.923115   49088 type.go:168] "Request Body" body=""
	I1202 19:10:24.923194   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:24.923600   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:25.423537   49088 type.go:168] "Request Body" body=""
	I1202 19:10:25.423610   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:25.423949   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:25.923703   49088 type.go:168] "Request Body" body=""
	I1202 19:10:25.923775   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:25.924043   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:10:25.924103   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:10:26.423902   49088 type.go:168] "Request Body" body=""
	I1202 19:10:26.423982   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:26.424292   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:26.922962   49088 type.go:168] "Request Body" body=""
	I1202 19:10:26.923073   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:26.923396   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:27.423706   49088 type.go:168] "Request Body" body=""
	I1202 19:10:27.423778   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:27.424090   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:27.923873   49088 type.go:168] "Request Body" body=""
	I1202 19:10:27.923954   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:27.924307   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:10:27.924397   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:10:28.422900   49088 type.go:168] "Request Body" body=""
	I1202 19:10:28.423017   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:28.423349   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:28.922918   49088 type.go:168] "Request Body" body=""
	I1202 19:10:28.922988   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:28.923270   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:29.422990   49088 type.go:168] "Request Body" body=""
	I1202 19:10:29.423072   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:29.423426   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:29.922978   49088 type.go:168] "Request Body" body=""
	I1202 19:10:29.923053   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:29.923386   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:30.422933   49088 type.go:168] "Request Body" body=""
	I1202 19:10:30.423008   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:30.423306   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:10:30.423392   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:10:30.923054   49088 type.go:168] "Request Body" body=""
	I1202 19:10:30.923133   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:30.923439   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:31.423123   49088 type.go:168] "Request Body" body=""
	I1202 19:10:31.423203   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:31.423539   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:31.923853   49088 type.go:168] "Request Body" body=""
	I1202 19:10:31.923920   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:31.924180   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:32.422889   49088 type.go:168] "Request Body" body=""
	I1202 19:10:32.422970   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:32.423319   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:32.922910   49088 type.go:168] "Request Body" body=""
	I1202 19:10:32.922985   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:32.923321   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:10:32.923375   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:10:33.423014   49088 type.go:168] "Request Body" body=""
	I1202 19:10:33.423095   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:33.423398   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:33.923072   49088 type.go:168] "Request Body" body=""
	I1202 19:10:33.923148   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:33.923513   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:34.423108   49088 type.go:168] "Request Body" body=""
	I1202 19:10:34.423190   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:34.423541   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:34.923286   49088 type.go:168] "Request Body" body=""
	I1202 19:10:34.923355   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:34.923630   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:10:34.923672   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:10:35.423752   49088 type.go:168] "Request Body" body=""
	I1202 19:10:35.423834   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:35.424190   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:35.922887   49088 type.go:168] "Request Body" body=""
	I1202 19:10:35.922967   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:35.923295   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:36.422982   49088 type.go:168] "Request Body" body=""
	I1202 19:10:36.423054   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:36.423305   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:36.923037   49088 type.go:168] "Request Body" body=""
	I1202 19:10:36.923115   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:36.923442   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:37.423029   49088 type.go:168] "Request Body" body=""
	I1202 19:10:37.423102   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:37.423425   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:10:37.423480   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:10:37.923773   49088 type.go:168] "Request Body" body=""
	I1202 19:10:37.923861   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:37.924136   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:38.423899   49088 type.go:168] "Request Body" body=""
	I1202 19:10:38.423979   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:38.424296   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:38.922970   49088 type.go:168] "Request Body" body=""
	I1202 19:10:38.923051   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:38.923372   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:39.423713   49088 type.go:168] "Request Body" body=""
	I1202 19:10:39.423780   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:39.424040   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:10:39.424081   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:10:39.923835   49088 type.go:168] "Request Body" body=""
	I1202 19:10:39.923908   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:39.924227   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:40.423209   49088 type.go:168] "Request Body" body=""
	I1202 19:10:40.423286   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:40.423612   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:40.923000   49088 type.go:168] "Request Body" body=""
	I1202 19:10:40.923083   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:40.923387   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:41.422980   49088 type.go:168] "Request Body" body=""
	I1202 19:10:41.423055   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:41.423415   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:41.922974   49088 type.go:168] "Request Body" body=""
	I1202 19:10:41.923048   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:41.923367   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:10:41.923430   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:10:42.422892   49088 type.go:168] "Request Body" body=""
	I1202 19:10:42.422960   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:42.423218   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:42.922977   49088 type.go:168] "Request Body" body=""
	I1202 19:10:42.923049   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:42.923389   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:43.422967   49088 type.go:168] "Request Body" body=""
	I1202 19:10:43.423039   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:43.423388   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:43.922911   49088 type.go:168] "Request Body" body=""
	I1202 19:10:43.922995   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:43.923286   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:44.422975   49088 type.go:168] "Request Body" body=""
	I1202 19:10:44.423057   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:44.423388   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:10:44.423441   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:10:44.923165   49088 type.go:168] "Request Body" body=""
	I1202 19:10:44.923245   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:44.923572   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:45.423613   49088 type.go:168] "Request Body" body=""
	I1202 19:10:45.423695   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:45.423958   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:45.924133   49088 type.go:168] "Request Body" body=""
	I1202 19:10:45.924208   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:45.924557   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:46.423281   49088 type.go:168] "Request Body" body=""
	I1202 19:10:46.423365   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:46.423700   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:10:46.423760   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:10:46.923435   49088 type.go:168] "Request Body" body=""
	I1202 19:10:46.923504   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:46.923772   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:47.423542   49088 type.go:168] "Request Body" body=""
	I1202 19:10:47.423616   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:47.423946   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:47.923718   49088 type.go:168] "Request Body" body=""
	I1202 19:10:47.923790   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:47.924128   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:48.423446   49088 type.go:168] "Request Body" body=""
	I1202 19:10:48.423517   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:48.423772   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:10:48.423814   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:10:48.923540   49088 type.go:168] "Request Body" body=""
	I1202 19:10:48.923616   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:48.923948   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:49.423625   49088 type.go:168] "Request Body" body=""
	I1202 19:10:49.423703   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:49.424044   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:49.923749   49088 type.go:168] "Request Body" body=""
	I1202 19:10:49.923817   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:49.924118   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:50.422979   49088 type.go:168] "Request Body" body=""
	I1202 19:10:50.423051   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:50.423386   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:50.923086   49088 type.go:168] "Request Body" body=""
	I1202 19:10:50.923163   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:50.923498   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:10:50.923558   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:10:51.422877   49088 type.go:168] "Request Body" body=""
	I1202 19:10:51.422947   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:51.423236   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:51.922968   49088 type.go:168] "Request Body" body=""
	I1202 19:10:51.923043   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:51.923370   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:52.423002   49088 type.go:168] "Request Body" body=""
	I1202 19:10:52.423076   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:52.423410   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:52.922879   49088 type.go:168] "Request Body" body=""
	I1202 19:10:52.922948   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:52.923224   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:53.422960   49088 type.go:168] "Request Body" body=""
	I1202 19:10:53.423034   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:53.423361   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:10:53.423412   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:10:53.923077   49088 type.go:168] "Request Body" body=""
	I1202 19:10:53.923157   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:53.923495   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:54.422917   49088 type.go:168] "Request Body" body=""
	I1202 19:10:54.422986   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:54.423246   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:54.923135   49088 type.go:168] "Request Body" body=""
	I1202 19:10:54.923208   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:54.923556   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:55.423559   49088 type.go:168] "Request Body" body=""
	I1202 19:10:55.423636   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:55.423971   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:10:55.424022   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:10:55.923605   49088 type.go:168] "Request Body" body=""
	I1202 19:10:55.923678   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:55.923946   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:56.423714   49088 type.go:168] "Request Body" body=""
	I1202 19:10:56.423787   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:56.424129   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:56.923785   49088 type.go:168] "Request Body" body=""
	I1202 19:10:56.923856   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:56.924173   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:57.423656   49088 type.go:168] "Request Body" body=""
	I1202 19:10:57.423734   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:57.423996   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:57.923693   49088 type.go:168] "Request Body" body=""
	I1202 19:10:57.923764   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:57.924078   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:10:57.924131   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:10:58.423895   49088 type.go:168] "Request Body" body=""
	I1202 19:10:58.423973   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:58.424286   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:58.922875   49088 type.go:168] "Request Body" body=""
	I1202 19:10:58.922950   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:58.923200   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:59.422980   49088 type.go:168] "Request Body" body=""
	I1202 19:10:59.423061   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:59.423383   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:59.922968   49088 type.go:168] "Request Body" body=""
	I1202 19:10:59.923042   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:59.923368   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:00.423287   49088 type.go:168] "Request Body" body=""
	I1202 19:11:00.423360   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:00.423657   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:11:00.423700   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:11:00.922983   49088 type.go:168] "Request Body" body=""
	I1202 19:11:00.923059   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:00.923393   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:01.423104   49088 type.go:168] "Request Body" body=""
	I1202 19:11:01.423186   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:01.423527   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:01.923028   49088 type.go:168] "Request Body" body=""
	I1202 19:11:01.923097   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:01.923355   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:02.422961   49088 type.go:168] "Request Body" body=""
	I1202 19:11:02.423036   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:02.423393   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:02.923093   49088 type.go:168] "Request Body" body=""
	I1202 19:11:02.923169   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:02.923483   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:11:02.923543   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:11:03.422923   49088 type.go:168] "Request Body" body=""
	I1202 19:11:03.422993   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:03.423275   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:03.922966   49088 type.go:168] "Request Body" body=""
	I1202 19:11:03.923046   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:03.923401   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:04.423109   49088 type.go:168] "Request Body" body=""
	I1202 19:11:04.423183   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:04.423480   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:04.922963   49088 type.go:168] "Request Body" body=""
	I1202 19:11:04.923029   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:04.923291   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:05.422943   49088 type.go:168] "Request Body" body=""
	I1202 19:11:05.423019   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:05.423416   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:11:05.423485   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	[... 4 identical GET https://192.168.49.2:8441/api/v1/nodes/functional-449836 polling cycles elided ...]
	W1202 19:11:07.423577   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	[... 4 identical GET https://192.168.49.2:8441/api/v1/nodes/functional-449836 polling cycles elided ...]
	W1202 19:11:09.423831   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	[... 5 identical GET https://192.168.49.2:8441/api/v1/nodes/functional-449836 polling cycles elided ...]
	W1202 19:11:11.923576   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	[... 4 identical GET https://192.168.49.2:8441/api/v1/nodes/functional-449836 polling cycles elided ...]
	W1202 19:11:13.924357   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	[... 5 identical GET https://192.168.49.2:8441/api/v1/nodes/functional-449836 polling cycles elided ...]
	W1202 19:11:16.424354   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	[... 5 identical GET https://192.168.49.2:8441/api/v1/nodes/functional-449836 polling cycles elided ...]
	W1202 19:11:18.923404   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	[... 4 identical GET https://192.168.49.2:8441/api/v1/nodes/functional-449836 polling cycles elided ...]
	W1202 19:11:20.923528   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	[... 4 identical GET https://192.168.49.2:8441/api/v1/nodes/functional-449836 polling cycles elided ...]
	W1202 19:11:22.924186   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	[... 5 identical GET https://192.168.49.2:8441/api/v1/nodes/functional-449836 polling cycles elided ...]
	W1202 19:11:25.423414   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	[... 5 identical GET https://192.168.49.2:8441/api/v1/nodes/functional-449836 polling cycles elided ...]
	W1202 19:11:27.923368   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	[... 4 identical GET https://192.168.49.2:8441/api/v1/nodes/functional-449836 polling cycles elided ...]
	W1202 19:11:29.924358   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	[... 5 identical GET https://192.168.49.2:8441/api/v1/nodes/functional-449836 polling cycles elided ...]
	W1202 19:11:32.423417   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	[... 4 identical GET https://192.168.49.2:8441/api/v1/nodes/functional-449836 polling cycles elided ...]
	W1202 19:11:34.423580   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	[... 5 identical GET https://192.168.49.2:8441/api/v1/nodes/functional-449836 polling cycles elided ...]
	W1202 19:11:36.923429   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:11:37.422938   49088 type.go:168] "Request Body" body=""
	I1202 19:11:37.423016   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:37.423347   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:37.923043   49088 type.go:168] "Request Body" body=""
	I1202 19:11:37.923123   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:37.923400   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:38.422941   49088 type.go:168] "Request Body" body=""
	I1202 19:11:38.423011   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:38.423321   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:38.922959   49088 type.go:168] "Request Body" body=""
	I1202 19:11:38.923034   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:38.923352   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:39.422867   49088 type.go:168] "Request Body" body=""
	I1202 19:11:39.422947   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:39.423239   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:11:39.423286   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:11:39.922967   49088 type.go:168] "Request Body" body=""
	I1202 19:11:39.923043   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:39.923373   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:40.422980   49088 type.go:168] "Request Body" body=""
	I1202 19:11:40.423062   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:40.423412   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:40.923647   49088 type.go:168] "Request Body" body=""
	I1202 19:11:40.923717   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:40.923978   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:41.423742   49088 type.go:168] "Request Body" body=""
	I1202 19:11:41.423821   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:41.424168   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:11:41.424221   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:11:41.922872   49088 type.go:168] "Request Body" body=""
	I1202 19:11:41.922945   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:41.923310   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:42.423001   49088 type.go:168] "Request Body" body=""
	I1202 19:11:42.423082   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:42.423364   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:42.922973   49088 type.go:168] "Request Body" body=""
	I1202 19:11:42.923054   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:42.923395   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:43.422979   49088 type.go:168] "Request Body" body=""
	I1202 19:11:43.423052   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:43.423368   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:43.922915   49088 type.go:168] "Request Body" body=""
	I1202 19:11:43.922986   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:43.923252   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:11:43.923292   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:11:44.422977   49088 type.go:168] "Request Body" body=""
	I1202 19:11:44.423058   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:44.423399   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:44.923181   49088 type.go:168] "Request Body" body=""
	I1202 19:11:44.923261   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:44.923585   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:45.423566   49088 type.go:168] "Request Body" body=""
	I1202 19:11:45.423644   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:45.423912   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:45.923624   49088 type.go:168] "Request Body" body=""
	I1202 19:11:45.923703   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:45.924023   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:11:45.924071   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:11:46.423663   49088 type.go:168] "Request Body" body=""
	I1202 19:11:46.423734   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:46.424056   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:46.923091   49088 type.go:168] "Request Body" body=""
	I1202 19:11:46.923157   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:46.923456   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:47.423153   49088 type.go:168] "Request Body" body=""
	I1202 19:11:47.423230   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:47.423569   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:47.923278   49088 type.go:168] "Request Body" body=""
	I1202 19:11:47.923362   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:47.923689   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:48.422896   49088 type.go:168] "Request Body" body=""
	I1202 19:11:48.422973   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:48.423231   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:11:48.423281   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:11:48.922955   49088 type.go:168] "Request Body" body=""
	I1202 19:11:48.923028   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:48.923392   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:49.423094   49088 type.go:168] "Request Body" body=""
	I1202 19:11:49.423172   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:49.423529   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:49.923206   49088 type.go:168] "Request Body" body=""
	I1202 19:11:49.923275   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:49.923532   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:50.423633   49088 type.go:168] "Request Body" body=""
	I1202 19:11:50.423711   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:50.424022   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:11:50.424076   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:11:50.923805   49088 type.go:168] "Request Body" body=""
	I1202 19:11:50.923878   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:50.924219   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:51.423760   49088 type.go:168] "Request Body" body=""
	I1202 19:11:51.423833   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:51.424113   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:51.923870   49088 type.go:168] "Request Body" body=""
	I1202 19:11:51.923950   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:51.924286   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:52.422868   49088 type.go:168] "Request Body" body=""
	I1202 19:11:52.422952   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:52.423285   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:52.923892   49088 type.go:168] "Request Body" body=""
	I1202 19:11:52.923962   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:52.924248   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:11:52.924291   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:11:53.422923   49088 type.go:168] "Request Body" body=""
	I1202 19:11:53.423002   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:53.423357   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:53.922965   49088 type.go:168] "Request Body" body=""
	I1202 19:11:53.923041   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:53.923384   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:54.422914   49088 type.go:168] "Request Body" body=""
	I1202 19:11:54.422991   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:54.423296   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:54.922927   49088 type.go:168] "Request Body" body=""
	I1202 19:11:54.923011   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:54.923324   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:55.423007   49088 type.go:168] "Request Body" body=""
	I1202 19:11:55.423084   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:55.423406   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:11:55.423459   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:11:55.922966   49088 type.go:168] "Request Body" body=""
	I1202 19:11:55.923046   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:55.923318   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:56.423006   49088 type.go:168] "Request Body" body=""
	I1202 19:11:56.423078   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:56.423413   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:56.923138   49088 type.go:168] "Request Body" body=""
	I1202 19:11:56.923215   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:56.923566   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:57.423251   49088 type.go:168] "Request Body" body=""
	I1202 19:11:57.423331   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:57.423624   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:11:57.423682   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:11:57.923555   49088 type.go:168] "Request Body" body=""
	I1202 19:11:57.923629   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:57.923962   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:58.423744   49088 type.go:168] "Request Body" body=""
	I1202 19:11:58.423824   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:58.424157   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:58.923666   49088 type.go:168] "Request Body" body=""
	I1202 19:11:58.923737   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:58.924002   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:59.423779   49088 type.go:168] "Request Body" body=""
	I1202 19:11:59.423856   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:59.424204   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:11:59.424262   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:11:59.922940   49088 type.go:168] "Request Body" body=""
	I1202 19:11:59.923028   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:59.923350   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:00.422961   49088 type.go:168] "Request Body" body=""
	I1202 19:12:00.423042   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:00.423415   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:00.923010   49088 type.go:168] "Request Body" body=""
	I1202 19:12:00.923091   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:00.923432   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:01.423015   49088 type.go:168] "Request Body" body=""
	I1202 19:12:01.423098   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:01.423440   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:01.923901   49088 type.go:168] "Request Body" body=""
	I1202 19:12:01.923978   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:01.924301   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:12:01.924373   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:12:02.423047   49088 type.go:168] "Request Body" body=""
	I1202 19:12:02.423120   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:02.423479   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:02.923197   49088 type.go:168] "Request Body" body=""
	I1202 19:12:02.923275   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:02.923599   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:03.422914   49088 type.go:168] "Request Body" body=""
	I1202 19:12:03.422989   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:03.423273   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:03.922947   49088 type.go:168] "Request Body" body=""
	I1202 19:12:03.923023   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:03.923352   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:04.422979   49088 type.go:168] "Request Body" body=""
	I1202 19:12:04.423057   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:04.423399   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:12:04.423455   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:12:04.923130   49088 type.go:168] "Request Body" body=""
	I1202 19:12:04.923196   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:04.923453   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:05.423379   49088 type.go:168] "Request Body" body=""
	I1202 19:12:05.423457   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:05.423797   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:05.923568   49088 type.go:168] "Request Body" body=""
	I1202 19:12:05.923643   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:05.923966   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:06.423488   49088 type.go:168] "Request Body" body=""
	I1202 19:12:06.423556   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:06.423829   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:12:06.423875   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:12:06.923674   49088 type.go:168] "Request Body" body=""
	I1202 19:12:06.923754   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:06.924076   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	[... ~68 near-identical GET request/response cycles elided: polling https://192.168.49.2:8441/api/v1/nodes/functional-449836 every 500ms with the same headers, each returning status="" in 0ms, with node_ready.go:55 "connect: connection refused" retry warnings logged roughly every 2s from 19:12:08 through 19:12:40 ...]
	I1202 19:12:41.422953   49088 type.go:168] "Request Body" body=""
	I1202 19:12:41.423050   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:41.423343   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:41.923078   49088 type.go:168] "Request Body" body=""
	I1202 19:12:41.923156   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:41.923473   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:42.423139   49088 type.go:168] "Request Body" body=""
	I1202 19:12:42.423204   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:42.423502   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:42.923011   49088 type.go:168] "Request Body" body=""
	I1202 19:12:42.923090   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:42.923406   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:43.423000   49088 type.go:168] "Request Body" body=""
	I1202 19:12:43.423079   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:43.423400   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:12:43.423446   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:12:43.923623   49088 type.go:168] "Request Body" body=""
	I1202 19:12:43.923696   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:43.923958   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:44.423707   49088 type.go:168] "Request Body" body=""
	I1202 19:12:44.423805   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:44.424192   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:44.923041   49088 type.go:168] "Request Body" body=""
	I1202 19:12:44.923114   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:44.923460   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:45.423487   49088 type.go:168] "Request Body" body=""
	I1202 19:12:45.423555   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:45.423816   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:12:45.423856   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:12:45.923602   49088 type.go:168] "Request Body" body=""
	I1202 19:12:45.923678   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:45.924003   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:46.423772   49088 type.go:168] "Request Body" body=""
	I1202 19:12:46.423843   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:46.424193   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:46.923702   49088 type.go:168] "Request Body" body=""
	I1202 19:12:46.923775   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:46.924028   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:47.423742   49088 type.go:168] "Request Body" body=""
	I1202 19:12:47.423811   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:47.424126   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:12:47.424184   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:12:47.923682   49088 type.go:168] "Request Body" body=""
	I1202 19:12:47.923759   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:47.924070   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:48.423650   49088 type.go:168] "Request Body" body=""
	I1202 19:12:48.423719   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:48.423981   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:48.923704   49088 type.go:168] "Request Body" body=""
	I1202 19:12:48.923774   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:48.924090   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:49.423900   49088 type.go:168] "Request Body" body=""
	I1202 19:12:49.423983   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:49.424354   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:12:49.424408   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:12:49.922950   49088 type.go:168] "Request Body" body=""
	I1202 19:12:49.923023   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:49.923365   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:50.423243   49088 type.go:168] "Request Body" body=""
	I1202 19:12:50.423322   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:50.423653   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:50.922964   49088 type.go:168] "Request Body" body=""
	I1202 19:12:50.923042   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:50.923377   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:51.422954   49088 type.go:168] "Request Body" body=""
	I1202 19:12:51.423023   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:51.423346   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:51.923047   49088 type.go:168] "Request Body" body=""
	I1202 19:12:51.923126   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:51.923460   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:12:51.923513   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:12:52.422937   49088 type.go:168] "Request Body" body=""
	I1202 19:12:52.423014   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:52.423303   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:52.922929   49088 type.go:168] "Request Body" body=""
	I1202 19:12:52.923016   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:52.923375   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:53.423008   49088 type.go:168] "Request Body" body=""
	I1202 19:12:53.423100   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:53.423482   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:53.922987   49088 type.go:168] "Request Body" body=""
	I1202 19:12:53.923061   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:53.923413   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:54.422909   49088 type.go:168] "Request Body" body=""
	I1202 19:12:54.422983   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:54.423246   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:12:54.423296   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:12:54.923072   49088 type.go:168] "Request Body" body=""
	I1202 19:12:54.923153   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:54.923523   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:55.423518   49088 type.go:168] "Request Body" body=""
	I1202 19:12:55.423603   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:55.423968   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:55.923602   49088 type.go:168] "Request Body" body=""
	I1202 19:12:55.923672   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:55.924000   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:56.423806   49088 type.go:168] "Request Body" body=""
	I1202 19:12:56.423881   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:56.424245   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:12:56.424298   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:12:56.923893   49088 type.go:168] "Request Body" body=""
	I1202 19:12:56.923967   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:56.924355   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:57.422930   49088 type.go:168] "Request Body" body=""
	I1202 19:12:57.422997   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:57.423287   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:57.922958   49088 type.go:168] "Request Body" body=""
	I1202 19:12:57.923030   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:57.923341   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:58.423065   49088 type.go:168] "Request Body" body=""
	I1202 19:12:58.423137   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:58.423498   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:58.923185   49088 type.go:168] "Request Body" body=""
	I1202 19:12:58.923255   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:58.923518   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:12:58.923557   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:12:59.422988   49088 type.go:168] "Request Body" body=""
	I1202 19:12:59.423062   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:59.423415   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:59.923116   49088 type.go:168] "Request Body" body=""
	I1202 19:12:59.923196   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:59.923537   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:00.422981   49088 type.go:168] "Request Body" body=""
	I1202 19:13:00.423055   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:00.423421   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:00.922957   49088 type.go:168] "Request Body" body=""
	I1202 19:13:00.923031   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:00.923358   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:01.423025   49088 type.go:168] "Request Body" body=""
	I1202 19:13:01.423098   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:01.423429   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:13:01.423485   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:13:01.923135   49088 type.go:168] "Request Body" body=""
	I1202 19:13:01.923210   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:01.923470   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:02.422970   49088 type.go:168] "Request Body" body=""
	I1202 19:13:02.423066   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:02.423386   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:02.923090   49088 type.go:168] "Request Body" body=""
	I1202 19:13:02.923194   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:02.923514   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:03.423202   49088 type.go:168] "Request Body" body=""
	I1202 19:13:03.423271   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:03.423580   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:13:03.423632   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:13:03.922949   49088 type.go:168] "Request Body" body=""
	I1202 19:13:03.923051   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:03.923344   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:04.422975   49088 type.go:168] "Request Body" body=""
	I1202 19:13:04.423049   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:04.423409   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:04.923130   49088 type.go:168] "Request Body" body=""
	I1202 19:13:04.923198   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:04.923485   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:05.423469   49088 type.go:168] "Request Body" body=""
	I1202 19:13:05.423540   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:05.423887   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:13:05.423943   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:13:05.923727   49088 type.go:168] "Request Body" body=""
	I1202 19:13:05.923801   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:05.924115   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:06.423862   49088 type.go:168] "Request Body" body=""
	I1202 19:13:06.423938   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:06.424199   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:06.922912   49088 type.go:168] "Request Body" body=""
	I1202 19:13:06.922984   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:06.923325   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:07.422978   49088 type.go:168] "Request Body" body=""
	I1202 19:13:07.423053   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:07.423373   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:07.922921   49088 type.go:168] "Request Body" body=""
	I1202 19:13:07.922995   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:07.923258   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:13:07.923298   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:13:08.422983   49088 type.go:168] "Request Body" body=""
	I1202 19:13:08.423057   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:08.423391   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:08.922933   49088 type.go:168] "Request Body" body=""
	I1202 19:13:08.923006   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:08.923329   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:09.422864   49088 type.go:168] "Request Body" body=""
	I1202 19:13:09.422931   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:09.423213   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:09.922936   49088 type.go:168] "Request Body" body=""
	I1202 19:13:09.923016   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:09.923330   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:13:09.923387   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:13:10.422968   49088 type.go:168] "Request Body" body=""
	I1202 19:13:10.423047   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:10.423382   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:10.922900   49088 type.go:168] "Request Body" body=""
	I1202 19:13:10.922972   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:10.923227   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:11.422967   49088 type.go:168] "Request Body" body=""
	I1202 19:13:11.423047   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:11.423372   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:11.922966   49088 type.go:168] "Request Body" body=""
	I1202 19:13:11.923038   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:11.923347   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:12.422911   49088 type.go:168] "Request Body" body=""
	I1202 19:13:12.422981   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:12.423291   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:13:12.423344   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:13:12.922990   49088 type.go:168] "Request Body" body=""
	I1202 19:13:12.923081   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:12.923443   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:13.423156   49088 type.go:168] "Request Body" body=""
	I1202 19:13:13.423234   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:13.423549   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:13.922900   49088 type.go:168] "Request Body" body=""
	I1202 19:13:13.922969   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:13.923235   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:14.422957   49088 type.go:168] "Request Body" body=""
	I1202 19:13:14.423029   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:14.423396   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:13:14.423449   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:13:14.923039   49088 type.go:168] "Request Body" body=""
	I1202 19:13:14.923111   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:14.923397   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:15.423223   49088 type.go:168] "Request Body" body=""
	I1202 19:13:15.423302   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:15.423557   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:15.922960   49088 type.go:168] "Request Body" body=""
	I1202 19:13:15.923034   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:15.923367   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:16.423062   49088 type.go:168] "Request Body" body=""
	I1202 19:13:16.423140   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:16.423473   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:13:16.423529   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:13:16.923839   49088 type.go:168] "Request Body" body=""
	I1202 19:13:16.923917   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:16.924188   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:17.422894   49088 type.go:168] "Request Body" body=""
	I1202 19:13:17.422973   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:17.423308   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:17.922909   49088 type.go:168] "Request Body" body=""
	I1202 19:13:17.922985   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:17.923348   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:18.423042   49088 type.go:168] "Request Body" body=""
	I1202 19:13:18.423112   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:18.423378   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:18.923054   49088 type.go:168] "Request Body" body=""
	I1202 19:13:18.923129   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:18.923451   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:13:18.923507   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:13:19.422987   49088 type.go:168] "Request Body" body=""
	I1202 19:13:19.423063   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:19.423405   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:19.923563   49088 type.go:168] "Request Body" body=""
	I1202 19:13:19.923634   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:19.923942   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:20.423766   49088 type.go:168] "Request Body" body=""
	I1202 19:13:20.423836   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:20.424183   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:20.922889   49088 type.go:168] "Request Body" body=""
	I1202 19:13:20.922963   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:20.923286   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:21.422910   49088 type.go:168] "Request Body" body=""
	I1202 19:13:21.422986   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:21.423265   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:13:21.423304   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:13:21.922960   49088 type.go:168] "Request Body" body=""
	I1202 19:13:21.923032   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:21.923372   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:22.422982   49088 type.go:168] "Request Body" body=""
	I1202 19:13:22.423066   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:22.423408   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:22.923660   49088 type.go:168] "Request Body" body=""
	I1202 19:13:22.923726   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:22.923989   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:23.423852   49088 type.go:168] "Request Body" body=""
	I1202 19:13:23.423930   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:23.424355   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:13:23.424410   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:13:23.922972   49088 type.go:168] "Request Body" body=""
	I1202 19:13:23.923088   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:23.923457   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:24.423150   49088 type.go:168] "Request Body" body=""
	I1202 19:13:24.423233   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:24.423566   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:24.923507   49088 type.go:168] "Request Body" body=""
	I1202 19:13:24.923591   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:24.923948   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:25.423717   49088 type.go:168] "Request Body" body=""
	I1202 19:13:25.423797   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:25.424161   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:25.922841   49088 node_ready.go:38] duration metric: took 6m0.000085627s for node "functional-449836" to be "Ready" ...
	I1202 19:13:25.925875   49088 out.go:203] 
	W1202 19:13:25.928738   49088 out.go:285] X Exiting due to GUEST_START: failed to start node: wait 6m0s for node: waiting for node to be ready: WaitNodeCondition: context deadline exceeded
	W1202 19:13:25.928760   49088 out.go:285] * 
	W1202 19:13:25.930899   49088 out.go:308] ╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	I1202 19:13:25.934748   49088 out.go:203] 
	
	
	==> container status <==
	CONTAINER           IMAGE               CREATED             STATE               NAME                ATTEMPT             POD ID              POD                 NAMESPACE
	
	
	==> containerd <==
	Dec 02 19:07:22 functional-449836 containerd[5842]: time="2025-12-02T19:07:22.957986214Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1
	Dec 02 19:07:22 functional-449836 containerd[5842]: time="2025-12-02T19:07:22.957997226Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1
	Dec 02 19:07:22 functional-449836 containerd[5842]: time="2025-12-02T19:07:22.958008417Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1
	Dec 02 19:07:22 functional-449836 containerd[5842]: time="2025-12-02T19:07:22.958018034Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1
	Dec 02 19:07:22 functional-449836 containerd[5842]: time="2025-12-02T19:07:22.958036085Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1
	Dec 02 19:07:22 functional-449836 containerd[5842]: time="2025-12-02T19:07:22.958054506Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1
	Dec 02 19:07:22 functional-449836 containerd[5842]: time="2025-12-02T19:07:22.958070481Z" level=info msg="runtime interface created"
	Dec 02 19:07:22 functional-449836 containerd[5842]: time="2025-12-02T19:07:22.958076692Z" level=info msg="created NRI interface"
	Dec 02 19:07:22 functional-449836 containerd[5842]: time="2025-12-02T19:07:22.958098690Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1
	Dec 02 19:07:22 functional-449836 containerd[5842]: time="2025-12-02T19:07:22.958133217Z" level=info msg="Connect containerd service"
	Dec 02 19:07:22 functional-449836 containerd[5842]: time="2025-12-02T19:07:22.958460151Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this"
	Dec 02 19:07:22 functional-449836 containerd[5842]: time="2025-12-02T19:07:22.959114281Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config"
	Dec 02 19:07:22 functional-449836 containerd[5842]: time="2025-12-02T19:07:22.969745294Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc
	Dec 02 19:07:22 functional-449836 containerd[5842]: time="2025-12-02T19:07:22.969836502Z" level=info msg=serving... address=/run/containerd/containerd.sock
	Dec 02 19:07:22 functional-449836 containerd[5842]: time="2025-12-02T19:07:22.969857893Z" level=info msg="Start subscribing containerd event"
	Dec 02 19:07:22 functional-449836 containerd[5842]: time="2025-12-02T19:07:22.969939477Z" level=info msg="Start recovering state"
	Dec 02 19:07:22 functional-449836 containerd[5842]: time="2025-12-02T19:07:22.992144454Z" level=info msg="Start event monitor"
	Dec 02 19:07:22 functional-449836 containerd[5842]: time="2025-12-02T19:07:22.992422247Z" level=info msg="Start cni network conf syncer for default"
	Dec 02 19:07:22 functional-449836 containerd[5842]: time="2025-12-02T19:07:22.992500294Z" level=info msg="Start streaming server"
	Dec 02 19:07:22 functional-449836 containerd[5842]: time="2025-12-02T19:07:22.992620565Z" level=info msg="Registered namespace \"k8s.io\" with NRI"
	Dec 02 19:07:22 functional-449836 containerd[5842]: time="2025-12-02T19:07:22.992683392Z" level=info msg="runtime interface starting up..."
	Dec 02 19:07:22 functional-449836 containerd[5842]: time="2025-12-02T19:07:22.992736816Z" level=info msg="starting plugins..."
	Dec 02 19:07:22 functional-449836 containerd[5842]: time="2025-12-02T19:07:22.992796598Z" level=info msg="Synchronizing NRI (plugin) with current runtime state"
	Dec 02 19:07:22 functional-449836 systemd[1]: Started containerd.service - containerd container runtime.
	Dec 02 19:07:22 functional-449836 containerd[5842]: time="2025-12-02T19:07:22.994992948Z" level=info msg="containerd successfully booted in 0.056864s"
	
	
	==> describe nodes <==
	command /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" failed with error: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:13:27.890906    9019 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:13:27.891738    9019 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:13:27.893775    9019 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:13:27.894433    9019 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:13:27.896131    9019 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	
	==> dmesg <==
	[Dec 2 18:17] ACPI: SRAT not present
	[  +0.000000] ACPI: SRAT not present
	[  +0.000000] SPI driver altr_a10sr has no spi_device_id for altr,a10sr
	[  +0.015127] device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
	[  +0.494583] systemd[1]: Configuration file /run/systemd/system/netplan-ovs-cleanup.service is marked world-inaccessible. This has no effect as configuration data is accessible via APIs without restrictions. Proceeding anyway.
	[  +0.035754] systemd[1]: /lib/systemd/system/snapd.service:23: Unknown key name 'RestartMode' in section 'Service', ignoring.
	[  +0.870945] ena 0000:00:05.0: LLQ is not supported Fallback to host mode policy.
	[  +6.299680] kauditd_printk_skb: 36 callbacks suppressed
	
	
	==> kernel <==
	 19:13:27 up 55 min,  0 user,  load average: 0.32, 0.32, 0.51
	Linux functional-449836 5.15.0-1084-aws #91~20.04.1-Ubuntu SMP Fri May 2 07:00:04 UTC 2025 aarch64 GNU/Linux
	PRETTY_NAME="Debian GNU/Linux 12 (bookworm)"
	
	
	==> kubelet <==
	Dec 02 19:13:24 functional-449836 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 02 19:13:25 functional-449836 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 809.
	Dec 02 19:13:25 functional-449836 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 02 19:13:25 functional-449836 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 02 19:13:25 functional-449836 kubelet[8903]: E1202 19:13:25.213331    8903 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 02 19:13:25 functional-449836 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 02 19:13:25 functional-449836 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 02 19:13:25 functional-449836 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 810.
	Dec 02 19:13:25 functional-449836 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 02 19:13:25 functional-449836 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 02 19:13:26 functional-449836 kubelet[8909]: E1202 19:13:26.018418    8909 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 02 19:13:26 functional-449836 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 02 19:13:26 functional-449836 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 02 19:13:26 functional-449836 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 811.
	Dec 02 19:13:26 functional-449836 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 02 19:13:26 functional-449836 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 02 19:13:26 functional-449836 kubelet[8915]: E1202 19:13:26.753510    8915 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 02 19:13:26 functional-449836 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 02 19:13:26 functional-449836 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 02 19:13:27 functional-449836 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 812.
	Dec 02 19:13:27 functional-449836 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 02 19:13:27 functional-449836 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 02 19:13:27 functional-449836 kubelet[8936]: E1202 19:13:27.487415    8936 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 02 19:13:27 functional-449836 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 02 19:13:27 functional-449836 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	

                                                
                                                
-- /stdout --
helpers_test.go:262: (dbg) Run:  out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-449836 -n functional-449836
helpers_test.go:262: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-449836 -n functional-449836: exit status 2 (397.108184ms)

                                                
                                                
-- stdout --
	Stopped

                                                
                                                
-- /stdout --
helpers_test.go:262: status error: exit status 2 (may be ok)
helpers_test.go:264: "functional-449836" apiserver is not running, skipping kubectl commands (state="Stopped")
--- FAIL: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/SoftStart (369.03s)
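The kubelet journal above shows the likely root cause of the apiserver never coming up: every restart (counters 809–812) fails validation with "kubelet is configured to not run on a host using cgroup v1", so the node can never report Ready. As a quick triage step (a generic sketch, not part of this test suite), the host's cgroup hierarchy can be checked by inspecting the filesystem type mounted at /sys/fs/cgroup:

```shell
# Detect the host cgroup hierarchy: "cgroup2fs" indicates the unified
# cgroup v2 hierarchy; "tmpfs" indicates the legacy v1 layout that the
# kubelet in this log refuses to run on.
fstype=$(stat -fc %T /sys/fs/cgroup 2>/dev/null || echo unknown)
case "$fstype" in
  cgroup2fs) echo "cgroup v2" ;;
  tmpfs)     echo "cgroup v1 (legacy hierarchy)" ;;
  *)         echo "unrecognized: $fstype" ;;
esac
```

On a v1 host like this runner (Ubuntu 20.04-based kernel per the dmesg/kernel sections), the fix is typically booting with `systemd.unified_cgroup_hierarchy=1` or moving to a newer host image, rather than anything in the test itself.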

                                                
                                    
TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/KubectlGetPods (2.32s)

                                                
                                                
=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/KubectlGetPods
functional_test.go:711: (dbg) Run:  kubectl --context functional-449836 get po -A
functional_test.go:711: (dbg) Non-zero exit: kubectl --context functional-449836 get po -A: exit status 1 (64.026565ms)

                                                
                                                
** stderr ** 
	The connection to the server 192.168.49.2:8441 was refused - did you specify the right host or port?

                                                
                                                
** /stderr **
functional_test.go:713: failed to get kubectl pods: args "kubectl --context functional-449836 get po -A" : exit status 1
functional_test.go:717: expected stderr to be empty but got *"The connection to the server 192.168.49.2:8441 was refused - did you specify the right host or port?\n"*: args "kubectl --context functional-449836 get po -A"
functional_test.go:720: expected stdout to include *kube-system* but got *""*. args: "kubectl --context functional-449836 get po -A"
helpers_test.go:222: -----------------------post-mortem--------------------------------
helpers_test.go:223: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/KubectlGetPods]: network settings <======
helpers_test.go:230: HOST ENV snapshots: PROXY env: HTTP_PROXY="<empty>" HTTPS_PROXY="<empty>" NO_PROXY="<empty>"
helpers_test.go:238: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/KubectlGetPods]: docker inspect <======
helpers_test.go:239: (dbg) Run:  docker inspect functional-449836
helpers_test.go:243: (dbg) docker inspect functional-449836:

                                                
                                                
-- stdout --
	[
	    {
	        "Id": "6870f21b62bb6903aca3129f1ce4723cf3f2ffad99a50b164f8e2dc04b50e75d",
	        "Created": "2025-12-02T18:58:53.515075222Z",
	        "Path": "/usr/local/bin/entrypoint",
	        "Args": [
	            "/sbin/init"
	        ],
	        "State": {
	            "Status": "running",
	            "Running": true,
	            "Paused": false,
	            "Restarting": false,
	            "OOMKilled": false,
	            "Dead": false,
	            "Pid": 43093,
	            "ExitCode": 0,
	            "Error": "",
	            "StartedAt": "2025-12-02T18:58:53.587847975Z",
	            "FinishedAt": "0001-01-01T00:00:00Z"
	        },
	        "Image": "sha256:ac919894123858c63a6b115b7a0677e38aafc32ba4f00c3ebbd7c61e958451be",
	        "ResolvConfPath": "/var/lib/docker/containers/6870f21b62bb6903aca3129f1ce4723cf3f2ffad99a50b164f8e2dc04b50e75d/resolv.conf",
	        "HostnamePath": "/var/lib/docker/containers/6870f21b62bb6903aca3129f1ce4723cf3f2ffad99a50b164f8e2dc04b50e75d/hostname",
	        "HostsPath": "/var/lib/docker/containers/6870f21b62bb6903aca3129f1ce4723cf3f2ffad99a50b164f8e2dc04b50e75d/hosts",
	        "LogPath": "/var/lib/docker/containers/6870f21b62bb6903aca3129f1ce4723cf3f2ffad99a50b164f8e2dc04b50e75d/6870f21b62bb6903aca3129f1ce4723cf3f2ffad99a50b164f8e2dc04b50e75d-json.log",
	        "Name": "/functional-449836",
	        "RestartCount": 0,
	        "Driver": "overlay2",
	        "Platform": "linux",
	        "MountLabel": "",
	        "ProcessLabel": "",
	        "AppArmorProfile": "unconfined",
	        "ExecIDs": null,
	        "HostConfig": {
	            "Binds": [
	                "/lib/modules:/lib/modules:ro",
	                "functional-449836:/var"
	            ],
	            "ContainerIDFile": "",
	            "LogConfig": {
	                "Type": "json-file",
	                "Config": {}
	            },
	            "NetworkMode": "functional-449836",
	            "PortBindings": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "8441/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ]
	            },
	            "RestartPolicy": {
	                "Name": "no",
	                "MaximumRetryCount": 0
	            },
	            "AutoRemove": false,
	            "VolumeDriver": "",
	            "VolumesFrom": null,
	            "ConsoleSize": [
	                0,
	                0
	            ],
	            "CapAdd": null,
	            "CapDrop": null,
	            "CgroupnsMode": "host",
	            "Dns": [],
	            "DnsOptions": [],
	            "DnsSearch": [],
	            "ExtraHosts": null,
	            "GroupAdd": null,
	            "IpcMode": "private",
	            "Cgroup": "",
	            "Links": null,
	            "OomScoreAdj": 0,
	            "PidMode": "",
	            "Privileged": true,
	            "PublishAllPorts": false,
	            "ReadonlyRootfs": false,
	            "SecurityOpt": [
	                "seccomp=unconfined",
	                "apparmor=unconfined",
	                "label=disable"
	            ],
	            "Tmpfs": {
	                "/run": "",
	                "/tmp": ""
	            },
	            "UTSMode": "",
	            "UsernsMode": "",
	            "ShmSize": 67108864,
	            "Runtime": "runc",
	            "Isolation": "",
	            "CpuShares": 0,
	            "Memory": 4294967296,
	            "NanoCpus": 2000000000,
	            "CgroupParent": "",
	            "BlkioWeight": 0,
	            "BlkioWeightDevice": [],
	            "BlkioDeviceReadBps": [],
	            "BlkioDeviceWriteBps": [],
	            "BlkioDeviceReadIOps": [],
	            "BlkioDeviceWriteIOps": [],
	            "CpuPeriod": 0,
	            "CpuQuota": 0,
	            "CpuRealtimePeriod": 0,
	            "CpuRealtimeRuntime": 0,
	            "CpusetCpus": "",
	            "CpusetMems": "",
	            "Devices": [],
	            "DeviceCgroupRules": null,
	            "DeviceRequests": null,
	            "MemoryReservation": 0,
	            "MemorySwap": 8589934592,
	            "MemorySwappiness": null,
	            "OomKillDisable": false,
	            "PidsLimit": null,
	            "Ulimits": [],
	            "CpuCount": 0,
	            "CpuPercent": 0,
	            "IOMaximumIOps": 0,
	            "IOMaximumBandwidth": 0,
	            "MaskedPaths": null,
	            "ReadonlyPaths": null
	        },
	        "GraphDriver": {
	            "Data": {
	                "ID": "6870f21b62bb6903aca3129f1ce4723cf3f2ffad99a50b164f8e2dc04b50e75d",
	                "LowerDir": "/var/lib/docker/overlay2/41ea5a18fbf8709879a7fc4066a6a4a1474aa86e898ee1ccabe5669a1871131d-init/diff:/var/lib/docker/overlay2/a59c61675ee48e07a7f4a8725bd393449453344ad8907963779ea1c0059d936c/diff",
	                "MergedDir": "/var/lib/docker/overlay2/41ea5a18fbf8709879a7fc4066a6a4a1474aa86e898ee1ccabe5669a1871131d/merged",
	                "UpperDir": "/var/lib/docker/overlay2/41ea5a18fbf8709879a7fc4066a6a4a1474aa86e898ee1ccabe5669a1871131d/diff",
	                "WorkDir": "/var/lib/docker/overlay2/41ea5a18fbf8709879a7fc4066a6a4a1474aa86e898ee1ccabe5669a1871131d/work"
	            },
	            "Name": "overlay2"
	        },
	        "Mounts": [
	            {
	                "Type": "bind",
	                "Source": "/lib/modules",
	                "Destination": "/lib/modules",
	                "Mode": "ro",
	                "RW": false,
	                "Propagation": "rprivate"
	            },
	            {
	                "Type": "volume",
	                "Name": "functional-449836",
	                "Source": "/var/lib/docker/volumes/functional-449836/_data",
	                "Destination": "/var",
	                "Driver": "local",
	                "Mode": "z",
	                "RW": true,
	                "Propagation": ""
	            }
	        ],
	        "Config": {
	            "Hostname": "functional-449836",
	            "Domainname": "",
	            "User": "",
	            "AttachStdin": false,
	            "AttachStdout": false,
	            "AttachStderr": false,
	            "ExposedPorts": {
	                "22/tcp": {},
	                "2376/tcp": {},
	                "32443/tcp": {},
	                "5000/tcp": {},
	                "8441/tcp": {}
	            },
	            "Tty": true,
	            "OpenStdin": false,
	            "StdinOnce": false,
	            "Env": [
	                "container=docker",
	                "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin"
	            ],
	            "Cmd": null,
	            "Image": "gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b",
	            "Volumes": null,
	            "WorkingDir": "/",
	            "Entrypoint": [
	                "/usr/local/bin/entrypoint",
	                "/sbin/init"
	            ],
	            "OnBuild": null,
	            "Labels": {
	                "created_by.minikube.sigs.k8s.io": "true",
	                "mode.minikube.sigs.k8s.io": "functional-449836",
	                "name.minikube.sigs.k8s.io": "functional-449836",
	                "role.minikube.sigs.k8s.io": ""
	            },
	            "StopSignal": "SIGRTMIN+3"
	        },
	        "NetworkSettings": {
	            "Bridge": "",
	            "SandboxID": "410fbaab809a56b556195f7ed6eeff8dcd31e9020fb1dbfacf74828b79df3d88",
	            "SandboxKey": "/var/run/docker/netns/410fbaab809a",
	            "Ports": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "32788"
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "32789"
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "32792"
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "32790"
	                    }
	                ],
	                "8441/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "32791"
	                    }
	                ]
	            },
	            "HairpinMode": false,
	            "LinkLocalIPv6Address": "",
	            "LinkLocalIPv6PrefixLen": 0,
	            "SecondaryIPAddresses": null,
	            "SecondaryIPv6Addresses": null,
	            "EndpointID": "",
	            "Gateway": "",
	            "GlobalIPv6Address": "",
	            "GlobalIPv6PrefixLen": 0,
	            "IPAddress": "",
	            "IPPrefixLen": 0,
	            "IPv6Gateway": "",
	            "MacAddress": "",
	            "Networks": {
	                "functional-449836": {
	                    "IPAMConfig": {
	                        "IPv4Address": "192.168.49.2"
	                    },
	                    "Links": null,
	                    "Aliases": null,
	                    "MacAddress": "82:18:11:ce:46:48",
	                    "DriverOpts": null,
	                    "GwPriority": 0,
	                    "NetworkID": "3cb7d67d4fa267ddd6b37211325b224fb3fb811be8ff57bda18e19f6929ec9c8",
	                    "EndpointID": "20c8c1a67e53d7615656777f73986a40cb1c6affb22c4db185c479ac85cbdb14",
	                    "Gateway": "192.168.49.1",
	                    "IPAddress": "192.168.49.2",
	                    "IPPrefixLen": 24,
	                    "IPv6Gateway": "",
	                    "GlobalIPv6Address": "",
	                    "GlobalIPv6PrefixLen": 0,
	                    "DNSNames": [
	                        "functional-449836",
	                        "6870f21b62bb"
	                    ]
	                }
	            }
	        }
	    }
	]

-- /stdout --
helpers_test.go:247: (dbg) Run:  out/minikube-linux-arm64 status --format={{.Host}} -p functional-449836 -n functional-449836
helpers_test.go:247: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.Host}} -p functional-449836 -n functional-449836: exit status 2 (315.263015ms)

-- stdout --
	Running

-- /stdout --
helpers_test.go:247: status error: exit status 2 (may be ok)
helpers_test.go:252: <<< TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/KubectlGetPods FAILED: start of post-mortem logs <<<
helpers_test.go:253: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/KubectlGetPods]: minikube logs <======
helpers_test.go:255: (dbg) Run:  out/minikube-linux-arm64 -p functional-449836 logs -n 25
helpers_test.go:255: (dbg) Done: out/minikube-linux-arm64 -p functional-449836 logs -n 25: (1.049569178s)
helpers_test.go:260: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/KubectlGetPods logs: 
-- stdout --
	
	==> Audit <==
	┌────────────────┬─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┬───────────────────┬─────────┬─────────┬─────────────────────┬─────────────────────┐
	│    COMMAND     │                                                                              ARGS                                                                               │      PROFILE      │  USER   │ VERSION │     START TIME      │      END TIME       │
	├────────────────┼─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┼───────────────────┼─────────┼─────────┼─────────────────────┼─────────────────────┤
	│ ssh            │ functional-224594 ssh sudo cat /etc/ssl/certs/44352.pem                                                                                                         │ functional-224594 │ jenkins │ v1.37.0 │ 02 Dec 25 18:58 UTC │ 02 Dec 25 18:58 UTC │
	│ image          │ functional-224594 image load --daemon kicbase/echo-server:functional-224594 --alsologtostderr                                                                   │ functional-224594 │ jenkins │ v1.37.0 │ 02 Dec 25 18:58 UTC │ 02 Dec 25 18:58 UTC │
	│ ssh            │ functional-224594 ssh sudo cat /usr/share/ca-certificates/44352.pem                                                                                             │ functional-224594 │ jenkins │ v1.37.0 │ 02 Dec 25 18:58 UTC │ 02 Dec 25 18:58 UTC │
	│ ssh            │ functional-224594 ssh sudo cat /etc/ssl/certs/3ec20f2e.0                                                                                                        │ functional-224594 │ jenkins │ v1.37.0 │ 02 Dec 25 18:58 UTC │ 02 Dec 25 18:58 UTC │
	│ image          │ functional-224594 image ls                                                                                                                                      │ functional-224594 │ jenkins │ v1.37.0 │ 02 Dec 25 18:58 UTC │ 02 Dec 25 18:58 UTC │
	│ ssh            │ functional-224594 ssh sudo cat /etc/test/nested/copy/4435/hosts                                                                                                 │ functional-224594 │ jenkins │ v1.37.0 │ 02 Dec 25 18:58 UTC │ 02 Dec 25 18:58 UTC │
	│ image          │ functional-224594 image save kicbase/echo-server:functional-224594 /home/jenkins/workspace/Docker_Linux_containerd_arm64/echo-server-save.tar --alsologtostderr │ functional-224594 │ jenkins │ v1.37.0 │ 02 Dec 25 18:58 UTC │ 02 Dec 25 18:58 UTC │
	│ image          │ functional-224594 image rm kicbase/echo-server:functional-224594 --alsologtostderr                                                                              │ functional-224594 │ jenkins │ v1.37.0 │ 02 Dec 25 18:58 UTC │ 02 Dec 25 18:58 UTC │
	│ image          │ functional-224594 image ls                                                                                                                                      │ functional-224594 │ jenkins │ v1.37.0 │ 02 Dec 25 18:58 UTC │ 02 Dec 25 18:58 UTC │
	│ image          │ functional-224594 image load /home/jenkins/workspace/Docker_Linux_containerd_arm64/echo-server-save.tar --alsologtostderr                                       │ functional-224594 │ jenkins │ v1.37.0 │ 02 Dec 25 18:58 UTC │ 02 Dec 25 18:58 UTC │
	│ image          │ functional-224594 image ls                                                                                                                                      │ functional-224594 │ jenkins │ v1.37.0 │ 02 Dec 25 18:58 UTC │ 02 Dec 25 18:58 UTC │
	│ update-context │ functional-224594 update-context --alsologtostderr -v=2                                                                                                         │ functional-224594 │ jenkins │ v1.37.0 │ 02 Dec 25 18:58 UTC │ 02 Dec 25 18:58 UTC │
	│ image          │ functional-224594 image save --daemon kicbase/echo-server:functional-224594 --alsologtostderr                                                                   │ functional-224594 │ jenkins │ v1.37.0 │ 02 Dec 25 18:58 UTC │ 02 Dec 25 18:58 UTC │
	│ update-context │ functional-224594 update-context --alsologtostderr -v=2                                                                                                         │ functional-224594 │ jenkins │ v1.37.0 │ 02 Dec 25 18:58 UTC │ 02 Dec 25 18:58 UTC │
	│ update-context │ functional-224594 update-context --alsologtostderr -v=2                                                                                                         │ functional-224594 │ jenkins │ v1.37.0 │ 02 Dec 25 18:58 UTC │ 02 Dec 25 18:58 UTC │
	│ image          │ functional-224594 image ls --format short --alsologtostderr                                                                                                     │ functional-224594 │ jenkins │ v1.37.0 │ 02 Dec 25 18:58 UTC │ 02 Dec 25 18:58 UTC │
	│ image          │ functional-224594 image ls --format yaml --alsologtostderr                                                                                                      │ functional-224594 │ jenkins │ v1.37.0 │ 02 Dec 25 18:58 UTC │ 02 Dec 25 18:58 UTC │
	│ ssh            │ functional-224594 ssh pgrep buildkitd                                                                                                                           │ functional-224594 │ jenkins │ v1.37.0 │ 02 Dec 25 18:58 UTC │                     │
	│ image          │ functional-224594 image ls --format json --alsologtostderr                                                                                                      │ functional-224594 │ jenkins │ v1.37.0 │ 02 Dec 25 18:58 UTC │ 02 Dec 25 18:58 UTC │
	│ image          │ functional-224594 image ls --format table --alsologtostderr                                                                                                     │ functional-224594 │ jenkins │ v1.37.0 │ 02 Dec 25 18:58 UTC │ 02 Dec 25 18:58 UTC │
	│ image          │ functional-224594 image build -t localhost/my-image:functional-224594 testdata/build --alsologtostderr                                                          │ functional-224594 │ jenkins │ v1.37.0 │ 02 Dec 25 18:58 UTC │ 02 Dec 25 18:58 UTC │
	│ image          │ functional-224594 image ls                                                                                                                                      │ functional-224594 │ jenkins │ v1.37.0 │ 02 Dec 25 18:58 UTC │ 02 Dec 25 18:58 UTC │
	│ delete         │ -p functional-224594                                                                                                                                            │ functional-224594 │ jenkins │ v1.37.0 │ 02 Dec 25 18:58 UTC │ 02 Dec 25 18:58 UTC │
	│ start          │ -p functional-449836 --memory=4096 --apiserver-port=8441 --wait=all --driver=docker  --container-runtime=containerd --kubernetes-version=v1.35.0-beta.0         │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 18:58 UTC │                     │
	│ start          │ -p functional-449836 --alsologtostderr -v=8                                                                                                                     │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:07 UTC │                     │
	└────────────────┴─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┴───────────────────┴─────────┴─────────┴─────────────────────┴─────────────────────┘
	
	
	==> Last Start <==
	Log file created at: 2025/12/02 19:07:19
	Running on machine: ip-172-31-31-251
	Binary: Built with gc go1.25.3 for linux/arm64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I1202 19:07:19.929855   49088 out.go:360] Setting OutFile to fd 1 ...
	I1202 19:07:19.930082   49088 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1202 19:07:19.930109   49088 out.go:374] Setting ErrFile to fd 2...
	I1202 19:07:19.930127   49088 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1202 19:07:19.930424   49088 root.go:338] Updating PATH: /home/jenkins/minikube-integration/22021-2487/.minikube/bin
	I1202 19:07:19.930829   49088 out.go:368] Setting JSON to false
	I1202 19:07:19.931678   49088 start.go:133] hostinfo: {"hostname":"ip-172-31-31-251","uptime":2976,"bootTime":1764699464,"procs":155,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1084-aws","kernelArch":"aarch64","virtualizationSystem":"","virtualizationRole":"","hostId":"982e3628-3742-4b3e-bb63-ac1b07660ec7"}
	I1202 19:07:19.931776   49088 start.go:143] virtualization:  
	I1202 19:07:19.935245   49088 out.go:179] * [functional-449836] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	I1202 19:07:19.939094   49088 out.go:179]   - MINIKUBE_LOCATION=22021
	I1202 19:07:19.939188   49088 notify.go:221] Checking for updates...
	I1202 19:07:19.944799   49088 out.go:179]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I1202 19:07:19.947646   49088 out.go:179]   - KUBECONFIG=/home/jenkins/minikube-integration/22021-2487/kubeconfig
	I1202 19:07:19.950501   49088 out.go:179]   - MINIKUBE_HOME=/home/jenkins/minikube-integration/22021-2487/.minikube
	I1202 19:07:19.953361   49088 out.go:179]   - MINIKUBE_BIN=out/minikube-linux-arm64
	I1202 19:07:19.956281   49088 out.go:179]   - MINIKUBE_FORCE_SYSTEMD=
	I1202 19:07:19.959695   49088 config.go:182] Loaded profile config "functional-449836": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-beta.0
	I1202 19:07:19.959887   49088 driver.go:422] Setting default libvirt URI to qemu:///system
	I1202 19:07:19.996438   49088 docker.go:124] docker version: linux-28.1.1:Docker Engine - Community
	I1202 19:07:19.996577   49088 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1202 19:07:20.063124   49088 info.go:266] docker info: {ID:EOU5:DNGX:XN6V:L2FZ:UXRM:5TWK:EVUR:KC2F:GT7Z:Y4O4:GB77:5PD3 Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:52 SystemTime:2025-12-02 19:07:20.053388152 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214839296 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-31-251 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1202 19:07:20.063232   49088 docker.go:319] overlay module found
	I1202 19:07:20.066390   49088 out.go:179] * Using the docker driver based on existing profile
	I1202 19:07:20.069271   49088 start.go:309] selected driver: docker
	I1202 19:07:20.069311   49088 start.go:927] validating driver "docker" against &{Name:functional-449836 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-449836 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1202 19:07:20.069422   49088 start.go:938] status for docker: {Installed:true Healthy:true Running:false NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I1202 19:07:20.069541   49088 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1202 19:07:20.132012   49088 info.go:266] docker info: {ID:EOU5:DNGX:XN6V:L2FZ:UXRM:5TWK:EVUR:KC2F:GT7Z:Y4O4:GB77:5PD3 Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:52 SystemTime:2025-12-02 19:07:20.122627931 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214839296 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-31-251 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1202 19:07:20.132615   49088 cni.go:84] Creating CNI manager for ""
	I1202 19:07:20.132692   49088 cni.go:143] "docker" driver + "containerd" runtime found, recommending kindnet
	I1202 19:07:20.132751   49088 start.go:353] cluster config:
	{Name:functional-449836 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-449836 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1202 19:07:20.135845   49088 out.go:179] * Starting "functional-449836" primary control-plane node in "functional-449836" cluster
	I1202 19:07:20.138639   49088 cache.go:134] Beginning downloading kic base image for docker with containerd
	I1202 19:07:20.141498   49088 out.go:179] * Pulling base image v0.0.48-1764169655-21974 ...
	I1202 19:07:20.144479   49088 image.go:81] Checking for gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b in local docker daemon
	I1202 19:07:20.144604   49088 preload.go:188] Checking if preload exists for k8s version v1.35.0-beta.0 and runtime containerd
	I1202 19:07:20.163347   49088 image.go:100] Found gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b in local docker daemon, skipping pull
	I1202 19:07:20.163372   49088 cache.go:158] gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b exists in daemon, skipping load
	W1202 19:07:20.218193   49088 preload.go:144] https://storage.googleapis.com/minikube-preloaded-volume-tarballs/v18/v1.35.0-beta.0/preloaded-images-k8s-v18-v1.35.0-beta.0-containerd-overlay2-arm64.tar.lz4 status code: 404
	W1202 19:07:20.422833   49088 preload.go:144] https://github.com/kubernetes-sigs/minikube-preloads/releases/download/v18/preloaded-images-k8s-v18-v1.35.0-beta.0-containerd-overlay2-arm64.tar.lz4 status code: 404
	I1202 19:07:20.423042   49088 profile.go:143] Saving config to /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/config.json ...
	I1202 19:07:20.423128   49088 cache.go:107] acquiring lock: {Name:mkb3ffc95e4b7ac3756206049d851bf516a8abb7 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1202 19:07:20.423219   49088 cache.go:115] /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/gcr.io/k8s-minikube/storage-provisioner_v5 exists
	I1202 19:07:20.423234   49088 cache.go:96] cache image "gcr.io/k8s-minikube/storage-provisioner:v5" -> "/home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/gcr.io/k8s-minikube/storage-provisioner_v5" took 122.125µs
	I1202 19:07:20.423249   49088 cache.go:80] save to tar file gcr.io/k8s-minikube/storage-provisioner:v5 -> /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/gcr.io/k8s-minikube/storage-provisioner_v5 succeeded
	I1202 19:07:20.423267   49088 cache.go:107] acquiring lock: {Name:mkfa39bba55c97fa80e441f8dcbaf6dc6a2ab6fc Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1202 19:07:20.423303   49088 cache.go:115] /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/kube-apiserver_v1.35.0-beta.0 exists
	I1202 19:07:20.423312   49088 cache.go:96] cache image "registry.k8s.io/kube-apiserver:v1.35.0-beta.0" -> "/home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/kube-apiserver_v1.35.0-beta.0" took 47.557µs
	I1202 19:07:20.423318   49088 cache.go:80] save to tar file registry.k8s.io/kube-apiserver:v1.35.0-beta.0 -> /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/kube-apiserver_v1.35.0-beta.0 succeeded
	I1202 19:07:20.423331   49088 cache.go:107] acquiring lock: {Name:mk7e3720bc30e96a70479f1acc707ef52791d566 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1202 19:07:20.423365   49088 cache.go:115] /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/kube-controller-manager_v1.35.0-beta.0 exists
	I1202 19:07:20.423374   49088 cache.go:96] cache image "registry.k8s.io/kube-controller-manager:v1.35.0-beta.0" -> "/home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/kube-controller-manager_v1.35.0-beta.0" took 44.415µs
	I1202 19:07:20.423380   49088 cache.go:80] save to tar file registry.k8s.io/kube-controller-manager:v1.35.0-beta.0 -> /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/kube-controller-manager_v1.35.0-beta.0 succeeded
	I1202 19:07:20.423395   49088 cache.go:107] acquiring lock: {Name:mk87fcb81abcb9216a37cb770c1db1797c0a7f91 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1202 19:07:20.423422   49088 cache.go:115] /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/kube-scheduler_v1.35.0-beta.0 exists
	I1202 19:07:20.423432   49088 cache.go:96] cache image "registry.k8s.io/kube-scheduler:v1.35.0-beta.0" -> "/home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/kube-scheduler_v1.35.0-beta.0" took 37.579µs
	I1202 19:07:20.423438   49088 cache.go:80] save to tar file registry.k8s.io/kube-scheduler:v1.35.0-beta.0 -> /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/kube-scheduler_v1.35.0-beta.0 succeeded
	I1202 19:07:20.423447   49088 cache.go:107] acquiring lock: {Name:mkb0da8840651a370490ea2b46213e13fc0d5dac Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1202 19:07:20.423476   49088 cache.go:115] /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/kube-proxy_v1.35.0-beta.0 exists
	I1202 19:07:20.423484   49088 cache.go:96] cache image "registry.k8s.io/kube-proxy:v1.35.0-beta.0" -> "/home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/kube-proxy_v1.35.0-beta.0" took 38.933µs
	I1202 19:07:20.423490   49088 cache.go:80] save to tar file registry.k8s.io/kube-proxy:v1.35.0-beta.0 -> /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/kube-proxy_v1.35.0-beta.0 succeeded
	I1202 19:07:20.423510   49088 cache.go:107] acquiring lock: {Name:mk9eec99a3e8e54b076a2ce506d08ceb8a7f49cb Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1202 19:07:20.423540   49088 cache.go:115] /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/pause_3.10.1 exists
	I1202 19:07:20.423549   49088 cache.go:96] cache image "registry.k8s.io/pause:3.10.1" -> "/home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/pause_3.10.1" took 40.796µs
	I1202 19:07:20.423555   49088 cache.go:80] save to tar file registry.k8s.io/pause:3.10.1 -> /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/pause_3.10.1 succeeded
	I1202 19:07:20.423569   49088 cache.go:243] Successfully downloaded all kic artifacts
	I1202 19:07:20.423588   49088 cache.go:107] acquiring lock: {Name:mkbf2c8ea9fae755e8e7ae1c483527f313757bae Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1202 19:07:20.423620   49088 cache.go:115] /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/coredns/coredns_v1.13.1 exists
	I1202 19:07:20.423629   49088 cache.go:96] cache image "registry.k8s.io/coredns/coredns:v1.13.1" -> "/home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/coredns/coredns_v1.13.1" took 42.487µs
	I1202 19:07:20.423635   49088 cache.go:80] save to tar file registry.k8s.io/coredns/coredns:v1.13.1 -> /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/coredns/coredns_v1.13.1 succeeded
	I1202 19:07:20.423646   49088 start.go:360] acquireMachinesLock for functional-449836: {Name:mk8999fdfa518fc15358d07431fe9bec286a035e Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1202 19:07:20.423706   49088 start.go:364] duration metric: took 31.868µs to acquireMachinesLock for "functional-449836"
	I1202 19:07:20.423570   49088 cache.go:107] acquiring lock: {Name:mk280b51a6d3bfe0cb60ae7355309f1bf1f99e1d Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1202 19:07:20.423753   49088 start.go:96] Skipping create...Using existing machine configuration
	I1202 19:07:20.423783   49088 fix.go:54] fixHost starting: 
	I1202 19:07:20.423759   49088 cache.go:115] /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/etcd_3.6.5-0 exists
	I1202 19:07:20.423888   49088 cache.go:96] cache image "registry.k8s.io/etcd:3.6.5-0" -> "/home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/etcd_3.6.5-0" took 323.2µs
	I1202 19:07:20.423896   49088 cache.go:80] save to tar file registry.k8s.io/etcd:3.6.5-0 -> /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/etcd_3.6.5-0 succeeded
	I1202 19:07:20.423906   49088 cache.go:87] Successfully saved all images to host disk.
	I1202 19:07:20.424111   49088 cli_runner.go:164] Run: docker container inspect functional-449836 --format={{.State.Status}}
	I1202 19:07:20.441213   49088 fix.go:112] recreateIfNeeded on functional-449836: state=Running err=<nil>
	W1202 19:07:20.441244   49088 fix.go:138] unexpected machine state, will restart: <nil>
	I1202 19:07:20.444707   49088 out.go:252] * Updating the running docker "functional-449836" container ...
	I1202 19:07:20.444749   49088 machine.go:94] provisionDockerMachine start ...
	I1202 19:07:20.444842   49088 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-449836
	I1202 19:07:20.461943   49088 main.go:143] libmachine: Using SSH client type: native
	I1202 19:07:20.462269   49088 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3dad70] 0x3dd270 <nil>  [] 0s} 127.0.0.1 32788 <nil> <nil>}
	I1202 19:07:20.462284   49088 main.go:143] libmachine: About to run SSH command:
	hostname
	I1202 19:07:20.612055   49088 main.go:143] libmachine: SSH cmd err, output: <nil>: functional-449836
	
	I1202 19:07:20.612125   49088 ubuntu.go:182] provisioning hostname "functional-449836"
	I1202 19:07:20.612222   49088 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-449836
	I1202 19:07:20.629856   49088 main.go:143] libmachine: Using SSH client type: native
	I1202 19:07:20.630166   49088 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3dad70] 0x3dd270 <nil>  [] 0s} 127.0.0.1 32788 <nil> <nil>}
	I1202 19:07:20.630180   49088 main.go:143] libmachine: About to run SSH command:
	sudo hostname functional-449836 && echo "functional-449836" | sudo tee /etc/hostname
	I1202 19:07:20.793419   49088 main.go:143] libmachine: SSH cmd err, output: <nil>: functional-449836
	
	I1202 19:07:20.793536   49088 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-449836
	I1202 19:07:20.812441   49088 main.go:143] libmachine: Using SSH client type: native
	I1202 19:07:20.812754   49088 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3dad70] 0x3dd270 <nil>  [] 0s} 127.0.0.1 32788 <nil> <nil>}
	I1202 19:07:20.812775   49088 main.go:143] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sfunctional-449836' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 functional-449836/g' /etc/hosts;
				else 
					echo '127.0.1.1 functional-449836' | sudo tee -a /etc/hosts; 
				fi
			fi
	I1202 19:07:20.961443   49088 main.go:143] libmachine: SSH cmd err, output: <nil>: 
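The /etc/hosts update minikube ran over SSH above is idempotent: it only touches the file when no entry for the hostname exists, and prefers rewriting an existing 127.0.1.1 line over appending. A minimal sketch of the same logic against a scratch copy of the file (the hostname and the sample contents are illustrative, not taken from the test host):

```shell
# Reproduce minikube's idempotent 127.0.1.1 hostname mapping against a
# scratch copy of /etc/hosts (sample contents are illustrative).
HOSTS=$(mktemp)
printf '127.0.0.1 localhost\n127.0.1.1 old-name\n' > "$HOSTS"
NAME=functional-449836
if ! grep -q "[[:space:]]$NAME\$" "$HOSTS"; then
    if grep -q '^127\.0\.1\.1[[:space:]]' "$HOSTS"; then
        # An existing 127.0.1.1 entry is rewritten in place
        sed -i "s/^127\.0\.1\.1[[:space:]].*/127.0.1.1 $NAME/" "$HOSTS"
    else
        # No 127.0.1.1 line yet: append one
        echo "127.0.1.1 $NAME" >> "$HOSTS"
    fi
fi
cat "$HOSTS"
```

Running the block a second time is a no-op, which is why the provisioner can safely re-run it on an already-provisioned machine.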
	I1202 19:07:20.961480   49088 ubuntu.go:188] set auth options {CertDir:/home/jenkins/minikube-integration/22021-2487/.minikube CaCertPath:/home/jenkins/minikube-integration/22021-2487/.minikube/certs/ca.pem CaPrivateKeyPath:/home/jenkins/minikube-integration/22021-2487/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/home/jenkins/minikube-integration/22021-2487/.minikube/machines/server.pem ServerKeyPath:/home/jenkins/minikube-integration/22021-2487/.minikube/machines/server-key.pem ClientKeyPath:/home/jenkins/minikube-integration/22021-2487/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/home/jenkins/minikube-integration/22021-2487/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/home/jenkins/minikube-integration/22021-2487/.minikube}
	I1202 19:07:20.961539   49088 ubuntu.go:190] setting up certificates
	I1202 19:07:20.961556   49088 provision.go:84] configureAuth start
	I1202 19:07:20.961634   49088 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" functional-449836
	I1202 19:07:20.990731   49088 provision.go:143] copyHostCerts
	I1202 19:07:20.990790   49088 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22021-2487/.minikube/certs/ca.pem -> /home/jenkins/minikube-integration/22021-2487/.minikube/ca.pem
	I1202 19:07:20.990838   49088 exec_runner.go:144] found /home/jenkins/minikube-integration/22021-2487/.minikube/ca.pem, removing ...
	I1202 19:07:20.990856   49088 exec_runner.go:203] rm: /home/jenkins/minikube-integration/22021-2487/.minikube/ca.pem
	I1202 19:07:20.990938   49088 exec_runner.go:151] cp: /home/jenkins/minikube-integration/22021-2487/.minikube/certs/ca.pem --> /home/jenkins/minikube-integration/22021-2487/.minikube/ca.pem (1082 bytes)
	I1202 19:07:20.991037   49088 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22021-2487/.minikube/certs/cert.pem -> /home/jenkins/minikube-integration/22021-2487/.minikube/cert.pem
	I1202 19:07:20.991060   49088 exec_runner.go:144] found /home/jenkins/minikube-integration/22021-2487/.minikube/cert.pem, removing ...
	I1202 19:07:20.991069   49088 exec_runner.go:203] rm: /home/jenkins/minikube-integration/22021-2487/.minikube/cert.pem
	I1202 19:07:20.991098   49088 exec_runner.go:151] cp: /home/jenkins/minikube-integration/22021-2487/.minikube/certs/cert.pem --> /home/jenkins/minikube-integration/22021-2487/.minikube/cert.pem (1123 bytes)
	I1202 19:07:20.991189   49088 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22021-2487/.minikube/certs/key.pem -> /home/jenkins/minikube-integration/22021-2487/.minikube/key.pem
	I1202 19:07:20.991211   49088 exec_runner.go:144] found /home/jenkins/minikube-integration/22021-2487/.minikube/key.pem, removing ...
	I1202 19:07:20.991220   49088 exec_runner.go:203] rm: /home/jenkins/minikube-integration/22021-2487/.minikube/key.pem
	I1202 19:07:20.991247   49088 exec_runner.go:151] cp: /home/jenkins/minikube-integration/22021-2487/.minikube/certs/key.pem --> /home/jenkins/minikube-integration/22021-2487/.minikube/key.pem (1675 bytes)
	I1202 19:07:20.991297   49088 provision.go:117] generating server cert: /home/jenkins/minikube-integration/22021-2487/.minikube/machines/server.pem ca-key=/home/jenkins/minikube-integration/22021-2487/.minikube/certs/ca.pem private-key=/home/jenkins/minikube-integration/22021-2487/.minikube/certs/ca-key.pem org=jenkins.functional-449836 san=[127.0.0.1 192.168.49.2 functional-449836 localhost minikube]
	I1202 19:07:21.335552   49088 provision.go:177] copyRemoteCerts
	I1202 19:07:21.335618   49088 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I1202 19:07:21.335658   49088 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-449836
	I1202 19:07:21.354079   49088 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:32788 SSHKeyPath:/home/jenkins/minikube-integration/22021-2487/.minikube/machines/functional-449836/id_rsa Username:docker}
	I1202 19:07:21.460475   49088 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22021-2487/.minikube/machines/server.pem -> /etc/docker/server.pem
	I1202 19:07:21.460535   49088 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/machines/server.pem --> /etc/docker/server.pem (1220 bytes)
	I1202 19:07:21.478965   49088 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22021-2487/.minikube/machines/server-key.pem -> /etc/docker/server-key.pem
	I1202 19:07:21.479028   49088 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1675 bytes)
	I1202 19:07:21.497363   49088 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22021-2487/.minikube/certs/ca.pem -> /etc/docker/ca.pem
	I1202 19:07:21.497471   49088 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1082 bytes)
	I1202 19:07:21.514946   49088 provision.go:87] duration metric: took 553.36724ms to configureAuth
	I1202 19:07:21.515020   49088 ubuntu.go:206] setting minikube options for container-runtime
	I1202 19:07:21.515215   49088 config.go:182] Loaded profile config "functional-449836": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-beta.0
	I1202 19:07:21.515248   49088 machine.go:97] duration metric: took 1.070490831s to provisionDockerMachine
	I1202 19:07:21.515264   49088 start.go:293] postStartSetup for "functional-449836" (driver="docker")
	I1202 19:07:21.515276   49088 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I1202 19:07:21.515329   49088 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I1202 19:07:21.515382   49088 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-449836
	I1202 19:07:21.532644   49088 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:32788 SSHKeyPath:/home/jenkins/minikube-integration/22021-2487/.minikube/machines/functional-449836/id_rsa Username:docker}
	I1202 19:07:21.636416   49088 ssh_runner.go:195] Run: cat /etc/os-release
	I1202 19:07:21.639685   49088 command_runner.go:130] > PRETTY_NAME="Debian GNU/Linux 12 (bookworm)"
	I1202 19:07:21.639756   49088 command_runner.go:130] > NAME="Debian GNU/Linux"
	I1202 19:07:21.639777   49088 command_runner.go:130] > VERSION_ID="12"
	I1202 19:07:21.639798   49088 command_runner.go:130] > VERSION="12 (bookworm)"
	I1202 19:07:21.639827   49088 command_runner.go:130] > VERSION_CODENAME=bookworm
	I1202 19:07:21.639832   49088 command_runner.go:130] > ID=debian
	I1202 19:07:21.639847   49088 command_runner.go:130] > HOME_URL="https://www.debian.org/"
	I1202 19:07:21.639859   49088 command_runner.go:130] > SUPPORT_URL="https://www.debian.org/support"
	I1202 19:07:21.639866   49088 command_runner.go:130] > BUG_REPORT_URL="https://bugs.debian.org/"
	I1202 19:07:21.639943   49088 main.go:143] libmachine: Couldn't set key VERSION_CODENAME, no corresponding struct field found
	I1202 19:07:21.639962   49088 info.go:137] Remote host: Debian GNU/Linux 12 (bookworm)
	I1202 19:07:21.639974   49088 filesync.go:126] Scanning /home/jenkins/minikube-integration/22021-2487/.minikube/addons for local assets ...
	I1202 19:07:21.640036   49088 filesync.go:126] Scanning /home/jenkins/minikube-integration/22021-2487/.minikube/files for local assets ...
	I1202 19:07:21.640112   49088 filesync.go:149] local asset: /home/jenkins/minikube-integration/22021-2487/.minikube/files/etc/ssl/certs/44352.pem -> 44352.pem in /etc/ssl/certs
	I1202 19:07:21.640123   49088 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22021-2487/.minikube/files/etc/ssl/certs/44352.pem -> /etc/ssl/certs/44352.pem
	I1202 19:07:21.640204   49088 filesync.go:149] local asset: /home/jenkins/minikube-integration/22021-2487/.minikube/files/etc/test/nested/copy/4435/hosts -> hosts in /etc/test/nested/copy/4435
	I1202 19:07:21.640213   49088 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22021-2487/.minikube/files/etc/test/nested/copy/4435/hosts -> /etc/test/nested/copy/4435/hosts
	I1202 19:07:21.640263   49088 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs /etc/test/nested/copy/4435
	I1202 19:07:21.647807   49088 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/files/etc/ssl/certs/44352.pem --> /etc/ssl/certs/44352.pem (1708 bytes)
	I1202 19:07:21.664872   49088 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/files/etc/test/nested/copy/4435/hosts --> /etc/test/nested/copy/4435/hosts (40 bytes)
	I1202 19:07:21.686465   49088 start.go:296] duration metric: took 171.184702ms for postStartSetup
	I1202 19:07:21.686545   49088 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I1202 19:07:21.686646   49088 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-449836
	I1202 19:07:21.708068   49088 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:32788 SSHKeyPath:/home/jenkins/minikube-integration/22021-2487/.minikube/machines/functional-449836/id_rsa Username:docker}
	I1202 19:07:21.808826   49088 command_runner.go:130] > 18%
	I1202 19:07:21.809461   49088 ssh_runner.go:195] Run: sh -c "df -BG /var | awk 'NR==2{print $4}'"
	I1202 19:07:21.814183   49088 command_runner.go:130] > 159G
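The two disk probes above pipe `df` through `awk 'NR==2{print $N}'` to pull a single column from the second row (the data row for /var). A minimal offline sketch of the same extraction, using a canned `df`-style table instead of a live filesystem:

```shell
# Extract use% (column 5) and free space (column 4) from df-style output,
# as the postStartSetup disk probes above do; the table is a canned sample.
SAMPLE='Filesystem      Size  Used Avail Use% Mounted on
/dev/root       196G   34G  159G  18% /var'
echo "$SAMPLE" | awk 'NR==2{print $5}'   # prints 18%
echo "$SAMPLE" | awk 'NR==2{print $4}'   # prints 159G
```

`NR==2` skips the header row, and awk's default whitespace splitting makes the column positions robust to `df`'s variable padding.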
	I1202 19:07:21.814719   49088 fix.go:56] duration metric: took 1.390932828s for fixHost
	I1202 19:07:21.814741   49088 start.go:83] releasing machines lock for "functional-449836", held for 1.391011327s
	I1202 19:07:21.814809   49088 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" functional-449836
	I1202 19:07:21.831833   49088 ssh_runner.go:195] Run: cat /version.json
	I1202 19:07:21.831895   49088 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-449836
	I1202 19:07:21.832169   49088 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I1202 19:07:21.832229   49088 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-449836
	I1202 19:07:21.852617   49088 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:32788 SSHKeyPath:/home/jenkins/minikube-integration/22021-2487/.minikube/machines/functional-449836/id_rsa Username:docker}
	I1202 19:07:21.855772   49088 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:32788 SSHKeyPath:/home/jenkins/minikube-integration/22021-2487/.minikube/machines/functional-449836/id_rsa Username:docker}
	I1202 19:07:21.955939   49088 command_runner.go:130] > {"iso_version": "v1.37.0-1763503576-21924", "kicbase_version": "v0.0.48-1764169655-21974", "minikube_version": "v1.37.0", "commit": "5499406178e21d60d74d327c9716de794e8a4797"}
	I1202 19:07:21.956090   49088 ssh_runner.go:195] Run: systemctl --version
	I1202 19:07:22.048548   49088 command_runner.go:130] > <a href="https://github.com/kubernetes/registry.k8s.io">Temporary Redirect</a>.
	I1202 19:07:22.051368   49088 command_runner.go:130] > systemd 252 (252.39-1~deb12u1)
	I1202 19:07:22.051402   49088 command_runner.go:130] > +PAM +AUDIT +SELINUX +APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT +QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified
	I1202 19:07:22.051488   49088 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	I1202 19:07:22.055900   49088 command_runner.go:130] ! stat: cannot statx '/etc/cni/net.d/*loopback.conf*': No such file or directory
	W1202 19:07:22.056072   49088 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I1202 19:07:22.056144   49088 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%p, " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I1202 19:07:22.064483   49088 cni.go:259] no active bridge cni configs found in "/etc/cni/net.d" - nothing to disable
	I1202 19:07:22.064507   49088 start.go:496] detecting cgroup driver to use...
	I1202 19:07:22.064540   49088 detect.go:187] detected "cgroupfs" cgroup driver on host os
	I1202 19:07:22.064608   49088 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I1202 19:07:22.080944   49088 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I1202 19:07:22.094328   49088 docker.go:218] disabling cri-docker service (if available) ...
	I1202 19:07:22.094412   49088 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.socket
	I1202 19:07:22.110538   49088 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.service
	I1202 19:07:22.123916   49088 ssh_runner.go:195] Run: sudo systemctl disable cri-docker.socket
	I1202 19:07:22.251555   49088 ssh_runner.go:195] Run: sudo systemctl mask cri-docker.service
	I1202 19:07:22.372403   49088 docker.go:234] disabling docker service ...
	I1202 19:07:22.372547   49088 ssh_runner.go:195] Run: sudo systemctl stop -f docker.socket
	I1202 19:07:22.390362   49088 ssh_runner.go:195] Run: sudo systemctl stop -f docker.service
	I1202 19:07:22.404129   49088 ssh_runner.go:195] Run: sudo systemctl disable docker.socket
	I1202 19:07:22.527674   49088 ssh_runner.go:195] Run: sudo systemctl mask docker.service
	I1202 19:07:22.641245   49088 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service docker
	I1202 19:07:22.654510   49088 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I1202 19:07:22.669149   49088 command_runner.go:130] > runtime-endpoint: unix:///run/containerd/containerd.sock
	I1202 19:07:22.670616   49088 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.10.1"|' /etc/containerd/config.toml"
	I1202 19:07:22.680782   49088 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I1202 19:07:22.690619   49088 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I1202 19:07:22.690690   49088 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I1202 19:07:22.700650   49088 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I1202 19:07:22.710637   49088 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I1202 19:07:22.720237   49088 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I1202 19:07:22.730375   49088 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I1202 19:07:22.738458   49088 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I1202 19:07:22.747256   49088 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I1202 19:07:22.756269   49088 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
	I1202 19:07:22.765824   49088 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I1202 19:07:22.772632   49088 command_runner.go:130] > net.bridge.bridge-nf-call-iptables = 1
	I1202 19:07:22.773683   49088 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I1202 19:07:22.781384   49088 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1202 19:07:22.894036   49088 ssh_runner.go:195] Run: sudo systemctl restart containerd
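The run of `sed -i -r` commands above rewrites /etc/containerd/config.toml in place before the daemon restart: pinning the sandbox (pause) image and forcing `SystemdCgroup = false` to match the detected "cgroupfs" host driver. The two central substitutions can be sketched against a scratch config (the TOML below is a minimal illustration, not the real kicbase config):

```shell
# Flip SystemdCgroup and the sandbox image in a scratch containerd config,
# mirroring the in-place sed edits above (config contents are illustrative).
CFG=$(mktemp)
cat > "$CFG" <<'EOF'
[plugins."io.containerd.grpc.v1.cri"]
  sandbox_image = "registry.k8s.io/pause:3.9"
  [plugins."io.containerd.grpc.v1.cri".containerd.runtimes.runc.options]
    SystemdCgroup = true
EOF
# \1 preserves the original indentation captured by ( *)
sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.10.1"|' "$CFG"
sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' "$CFG"
cat "$CFG"
```

Editing the file textually and then restarting containerd (as the next step does) is how minikube reconciles the runtime config without a TOML parser on the guest.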
	I1202 19:07:22.996092   49088 start.go:543] Will wait 60s for socket path /run/containerd/containerd.sock
	I1202 19:07:22.996190   49088 ssh_runner.go:195] Run: stat /run/containerd/containerd.sock
	I1202 19:07:23.000049   49088 command_runner.go:130] >   File: /run/containerd/containerd.sock
	I1202 19:07:23.000075   49088 command_runner.go:130] >   Size: 0         	Blocks: 0          IO Block: 4096   socket
	I1202 19:07:23.000083   49088 command_runner.go:130] > Device: 0,72	Inode: 1611        Links: 1
	I1202 19:07:23.000090   49088 command_runner.go:130] > Access: (0660/srw-rw----)  Uid: (    0/    root)   Gid: (    0/    root)
	I1202 19:07:23.000119   49088 command_runner.go:130] > Access: 2025-12-02 19:07:22.966112143 +0000
	I1202 19:07:23.000134   49088 command_runner.go:130] > Modify: 2025-12-02 19:07:22.966112143 +0000
	I1202 19:07:23.000139   49088 command_runner.go:130] > Change: 2025-12-02 19:07:22.966112143 +0000
	I1202 19:07:23.000143   49088 command_runner.go:130] >  Birth: -
	I1202 19:07:23.000708   49088 start.go:564] Will wait 60s for crictl version
	I1202 19:07:23.000798   49088 ssh_runner.go:195] Run: which crictl
	I1202 19:07:23.004553   49088 command_runner.go:130] > /usr/local/bin/crictl
	I1202 19:07:23.004698   49088 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl version
	I1202 19:07:23.031006   49088 command_runner.go:130] > Version:  0.1.0
	I1202 19:07:23.031142   49088 command_runner.go:130] > RuntimeName:  containerd
	I1202 19:07:23.031156   49088 command_runner.go:130] > RuntimeVersion:  v2.1.5
	I1202 19:07:23.031165   49088 command_runner.go:130] > RuntimeApiVersion:  v1
	I1202 19:07:23.033497   49088 start.go:580] Version:  0.1.0
	RuntimeName:  containerd
	RuntimeVersion:  v2.1.5
	RuntimeApiVersion:  v1
	I1202 19:07:23.033588   49088 ssh_runner.go:195] Run: containerd --version
	I1202 19:07:23.053512   49088 command_runner.go:130] > containerd containerd.io v2.1.5 fcd43222d6b07379a4be9786bda52438f0dd16a1
	I1202 19:07:23.055064   49088 ssh_runner.go:195] Run: containerd --version
	I1202 19:07:23.073280   49088 command_runner.go:130] > containerd containerd.io v2.1.5 fcd43222d6b07379a4be9786bda52438f0dd16a1
	I1202 19:07:23.080684   49088 out.go:179] * Preparing Kubernetes v1.35.0-beta.0 on containerd 2.1.5 ...
	I1202 19:07:23.083736   49088 cli_runner.go:164] Run: docker network inspect functional-449836 --format "{"Name": "{{.Name}}","Driver": "{{.Driver}}","Subnet": "{{range .IPAM.Config}}{{.Subnet}}{{end}}","Gateway": "{{range .IPAM.Config}}{{.Gateway}}{{end}}","MTU": {{if (index .Options "com.docker.network.driver.mtu")}}{{(index .Options "com.docker.network.driver.mtu")}}{{else}}0{{end}}, "ContainerIPs": [{{range $k,$v := .Containers }}"{{$v.IPv4Address}}",{{end}}]}"
	I1202 19:07:23.100485   49088 ssh_runner.go:195] Run: grep 192.168.49.1	host.minikube.internal$ /etc/hosts
	I1202 19:07:23.104603   49088 command_runner.go:130] > 192.168.49.1	host.minikube.internal
	I1202 19:07:23.104709   49088 kubeadm.go:884] updating cluster {Name:functional-449836 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-449836 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} ...
	I1202 19:07:23.104831   49088 preload.go:188] Checking if preload exists for k8s version v1.35.0-beta.0 and runtime containerd
	I1202 19:07:23.104890   49088 ssh_runner.go:195] Run: sudo crictl images --output json
	I1202 19:07:23.127690   49088 command_runner.go:130] > {
	I1202 19:07:23.127710   49088 command_runner.go:130] >   "images":  [
	I1202 19:07:23.127715   49088 command_runner.go:130] >     {
	I1202 19:07:23.127725   49088 command_runner.go:130] >       "id":  "sha256:66749159455b3f08c8318fe0233122f54d0f5889f9c5fdfb73c3fd9d99895b51",
	I1202 19:07:23.127729   49088 command_runner.go:130] >       "repoTags":  [
	I1202 19:07:23.127744   49088 command_runner.go:130] >         "gcr.io/k8s-minikube/storage-provisioner:v5"
	I1202 19:07:23.127750   49088 command_runner.go:130] >       ],
	I1202 19:07:23.127755   49088 command_runner.go:130] >       "repoDigests":  [],
	I1202 19:07:23.127759   49088 command_runner.go:130] >       "size":  "8032639",
	I1202 19:07:23.127765   49088 command_runner.go:130] >       "username":  "",
	I1202 19:07:23.127776   49088 command_runner.go:130] >       "pinned":  false
	I1202 19:07:23.127781   49088 command_runner.go:130] >     },
	I1202 19:07:23.127784   49088 command_runner.go:130] >     {
	I1202 19:07:23.127792   49088 command_runner.go:130] >       "id":  "sha256:e08f4d9d2e6ede8185064c13b41f8eeee95b609c0ca93b6fe7509fe527c907cf",
	I1202 19:07:23.127800   49088 command_runner.go:130] >       "repoTags":  [
	I1202 19:07:23.127806   49088 command_runner.go:130] >         "registry.k8s.io/coredns/coredns:v1.13.1"
	I1202 19:07:23.127813   49088 command_runner.go:130] >       ],
	I1202 19:07:23.127817   49088 command_runner.go:130] >       "repoDigests":  [],
	I1202 19:07:23.127822   49088 command_runner.go:130] >       "size":  "21166088",
	I1202 19:07:23.127826   49088 command_runner.go:130] >       "username":  "nonroot",
	I1202 19:07:23.127832   49088 command_runner.go:130] >       "pinned":  false
	I1202 19:07:23.127835   49088 command_runner.go:130] >     },
	I1202 19:07:23.127838   49088 command_runner.go:130] >     {
	I1202 19:07:23.127845   49088 command_runner.go:130] >       "id":  "sha256:2c5f0dedd21c25ec3a6709934d22152d53ec50fe57b72d29e4450655e3d14d42",
	I1202 19:07:23.127855   49088 command_runner.go:130] >       "repoTags":  [
	I1202 19:07:23.127869   49088 command_runner.go:130] >         "registry.k8s.io/etcd:3.6.5-0"
	I1202 19:07:23.127876   49088 command_runner.go:130] >       ],
	I1202 19:07:23.127880   49088 command_runner.go:130] >       "repoDigests":  [],
	I1202 19:07:23.127887   49088 command_runner.go:130] >       "size":  "21134420",
	I1202 19:07:23.127892   49088 command_runner.go:130] >       "uid":  {
	I1202 19:07:23.127899   49088 command_runner.go:130] >         "value":  "0"
	I1202 19:07:23.127903   49088 command_runner.go:130] >       },
	I1202 19:07:23.127907   49088 command_runner.go:130] >       "username":  "",
	I1202 19:07:23.127911   49088 command_runner.go:130] >       "pinned":  false
	I1202 19:07:23.127917   49088 command_runner.go:130] >     },
	I1202 19:07:23.127919   49088 command_runner.go:130] >     {
	I1202 19:07:23.127926   49088 command_runner.go:130] >       "id":  "sha256:ccd634d9bcc36ac6235e9c86676cd3a02c06afc3788a25f1bbf39ca7d44585f4",
	I1202 19:07:23.127930   49088 command_runner.go:130] >       "repoTags":  [
	I1202 19:07:23.127938   49088 command_runner.go:130] >         "registry.k8s.io/kube-apiserver:v1.35.0-beta.0"
	I1202 19:07:23.127945   49088 command_runner.go:130] >       ],
	I1202 19:07:23.127949   49088 command_runner.go:130] >       "repoDigests":  [],
	I1202 19:07:23.127953   49088 command_runner.go:130] >       "size":  "24676285",
	I1202 19:07:23.127961   49088 command_runner.go:130] >       "uid":  {
	I1202 19:07:23.127965   49088 command_runner.go:130] >         "value":  "0"
	I1202 19:07:23.127971   49088 command_runner.go:130] >       },
	I1202 19:07:23.127975   49088 command_runner.go:130] >       "username":  "",
	I1202 19:07:23.127983   49088 command_runner.go:130] >       "pinned":  false
	I1202 19:07:23.127987   49088 command_runner.go:130] >     },
	I1202 19:07:23.127996   49088 command_runner.go:130] >     {
	I1202 19:07:23.128002   49088 command_runner.go:130] >       "id":  "sha256:68b5f775f18769fcb77bd8474c80bda2050163b6c66f4551f352b7381b8ca5be",
	I1202 19:07:23.128006   49088 command_runner.go:130] >       "repoTags":  [
	I1202 19:07:23.128012   49088 command_runner.go:130] >         "registry.k8s.io/kube-controller-manager:v1.35.0-beta.0"
	I1202 19:07:23.128015   49088 command_runner.go:130] >       ],
	I1202 19:07:23.128019   49088 command_runner.go:130] >       "repoDigests":  [],
	I1202 19:07:23.128026   49088 command_runner.go:130] >       "size":  "20658969",
	I1202 19:07:23.128029   49088 command_runner.go:130] >       "uid":  {
	I1202 19:07:23.128033   49088 command_runner.go:130] >         "value":  "0"
	I1202 19:07:23.128041   49088 command_runner.go:130] >       },
	I1202 19:07:23.128052   49088 command_runner.go:130] >       "username":  "",
	I1202 19:07:23.128059   49088 command_runner.go:130] >       "pinned":  false
	I1202 19:07:23.128063   49088 command_runner.go:130] >     },
	I1202 19:07:23.128070   49088 command_runner.go:130] >     {
	I1202 19:07:23.128077   49088 command_runner.go:130] >       "id":  "sha256:404c2e12861777b763b8feaa316d36680fc68ad308a8d2f6e55f1bb981cdd904",
	I1202 19:07:23.128081   49088 command_runner.go:130] >       "repoTags":  [
	I1202 19:07:23.128088   49088 command_runner.go:130] >         "registry.k8s.io/kube-proxy:v1.35.0-beta.0"
	I1202 19:07:23.128092   49088 command_runner.go:130] >       ],
	I1202 19:07:23.128096   49088 command_runner.go:130] >       "repoDigests":  [],
	I1202 19:07:23.128099   49088 command_runner.go:130] >       "size":  "22428165",
	I1202 19:07:23.128103   49088 command_runner.go:130] >       "username":  "",
	I1202 19:07:23.128109   49088 command_runner.go:130] >       "pinned":  false
	I1202 19:07:23.128113   49088 command_runner.go:130] >     },
	I1202 19:07:23.128116   49088 command_runner.go:130] >     {
	I1202 19:07:23.128123   49088 command_runner.go:130] >       "id":  "sha256:16378741539f1be9c6e347d127537d379a6592587b09b4eb47964cb5c43a409b",
	I1202 19:07:23.128130   49088 command_runner.go:130] >       "repoTags":  [
	I1202 19:07:23.128135   49088 command_runner.go:130] >         "registry.k8s.io/kube-scheduler:v1.35.0-beta.0"
	I1202 19:07:23.128143   49088 command_runner.go:130] >       ],
	I1202 19:07:23.128152   49088 command_runner.go:130] >       "repoDigests":  [],
	I1202 19:07:23.128160   49088 command_runner.go:130] >       "size":  "15389290",
	I1202 19:07:23.128163   49088 command_runner.go:130] >       "uid":  {
	I1202 19:07:23.128167   49088 command_runner.go:130] >         "value":  "0"
	I1202 19:07:23.128170   49088 command_runner.go:130] >       },
	I1202 19:07:23.128175   49088 command_runner.go:130] >       "username":  "",
	I1202 19:07:23.128179   49088 command_runner.go:130] >       "pinned":  false
	I1202 19:07:23.128185   49088 command_runner.go:130] >     },
	I1202 19:07:23.128188   49088 command_runner.go:130] >     {
	I1202 19:07:23.128199   49088 command_runner.go:130] >       "id":  "sha256:d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd",
	I1202 19:07:23.128203   49088 command_runner.go:130] >       "repoTags":  [
	I1202 19:07:23.128212   49088 command_runner.go:130] >         "registry.k8s.io/pause:3.10.1"
	I1202 19:07:23.128215   49088 command_runner.go:130] >       ],
	I1202 19:07:23.128223   49088 command_runner.go:130] >       "repoDigests":  [],
	I1202 19:07:23.128227   49088 command_runner.go:130] >       "size":  "265458",
	I1202 19:07:23.128238   49088 command_runner.go:130] >       "uid":  {
	I1202 19:07:23.128243   49088 command_runner.go:130] >         "value":  "65535"
	I1202 19:07:23.128248   49088 command_runner.go:130] >       },
	I1202 19:07:23.128252   49088 command_runner.go:130] >       "username":  "",
	I1202 19:07:23.128256   49088 command_runner.go:130] >       "pinned":  true
	I1202 19:07:23.128259   49088 command_runner.go:130] >     }
	I1202 19:07:23.128262   49088 command_runner.go:130] >   ]
	I1202 19:07:23.128265   49088 command_runner.go:130] > }
	I1202 19:07:23.130379   49088 containerd.go:627] all images are preloaded for containerd runtime.
	I1202 19:07:23.130403   49088 cache_images.go:86] Images are preloaded, skipping loading
	I1202 19:07:23.130410   49088 kubeadm.go:935] updating node { 192.168.49.2 8441 v1.35.0-beta.0 containerd true true} ...
	I1202 19:07:23.130509   49088 kubeadm.go:947] kubelet [Unit]
	Wants=containerd.service
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.35.0-beta.0/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=functional-449836 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.168.49.2
	
	[Install]
	 config:
	{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-449836 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I1202 19:07:23.130576   49088 ssh_runner.go:195] Run: sudo crictl info
	I1202 19:07:23.152707   49088 command_runner.go:130] > {
	I1202 19:07:23.152731   49088 command_runner.go:130] >   "cniconfig": {
	I1202 19:07:23.152737   49088 command_runner.go:130] >     "Networks": [
	I1202 19:07:23.152741   49088 command_runner.go:130] >       {
	I1202 19:07:23.152746   49088 command_runner.go:130] >         "Config": {
	I1202 19:07:23.152752   49088 command_runner.go:130] >           "CNIVersion": "0.3.1",
	I1202 19:07:23.152758   49088 command_runner.go:130] >           "Name": "cni-loopback",
	I1202 19:07:23.152768   49088 command_runner.go:130] >           "Plugins": [
	I1202 19:07:23.152775   49088 command_runner.go:130] >             {
	I1202 19:07:23.152779   49088 command_runner.go:130] >               "Network": {
	I1202 19:07:23.152784   49088 command_runner.go:130] >                 "ipam": {},
	I1202 19:07:23.152789   49088 command_runner.go:130] >                 "type": "loopback"
	I1202 19:07:23.152798   49088 command_runner.go:130] >               },
	I1202 19:07:23.152803   49088 command_runner.go:130] >               "Source": "{\"type\":\"loopback\"}"
	I1202 19:07:23.152810   49088 command_runner.go:130] >             }
	I1202 19:07:23.152814   49088 command_runner.go:130] >           ],
	I1202 19:07:23.152828   49088 command_runner.go:130] >           "Source": "{\n\"cniVersion\": \"0.3.1\",\n\"name\": \"cni-loopback\",\n\"plugins\": [{\n  \"type\": \"loopback\"\n}]\n}"
	I1202 19:07:23.152835   49088 command_runner.go:130] >         },
	I1202 19:07:23.152840   49088 command_runner.go:130] >         "IFName": "lo"
	I1202 19:07:23.152847   49088 command_runner.go:130] >       }
	I1202 19:07:23.152850   49088 command_runner.go:130] >     ],
	I1202 19:07:23.152855   49088 command_runner.go:130] >     "PluginConfDir": "/etc/cni/net.d",
	I1202 19:07:23.152860   49088 command_runner.go:130] >     "PluginDirs": [
	I1202 19:07:23.152865   49088 command_runner.go:130] >       "/opt/cni/bin"
	I1202 19:07:23.152869   49088 command_runner.go:130] >     ],
	I1202 19:07:23.152873   49088 command_runner.go:130] >     "PluginMaxConfNum": 1,
	I1202 19:07:23.152879   49088 command_runner.go:130] >     "Prefix": "eth"
	I1202 19:07:23.152883   49088 command_runner.go:130] >   },
	I1202 19:07:23.152891   49088 command_runner.go:130] >   "config": {
	I1202 19:07:23.152894   49088 command_runner.go:130] >     "cdiSpecDirs": [
	I1202 19:07:23.152898   49088 command_runner.go:130] >       "/etc/cdi",
	I1202 19:07:23.152907   49088 command_runner.go:130] >       "/var/run/cdi"
	I1202 19:07:23.152910   49088 command_runner.go:130] >     ],
	I1202 19:07:23.152917   49088 command_runner.go:130] >     "cni": {
	I1202 19:07:23.152921   49088 command_runner.go:130] >       "binDir": "",
	I1202 19:07:23.152928   49088 command_runner.go:130] >       "binDirs": [
	I1202 19:07:23.152933   49088 command_runner.go:130] >         "/opt/cni/bin"
	I1202 19:07:23.152936   49088 command_runner.go:130] >       ],
	I1202 19:07:23.152941   49088 command_runner.go:130] >       "confDir": "/etc/cni/net.d",
	I1202 19:07:23.152947   49088 command_runner.go:130] >       "confTemplate": "",
	I1202 19:07:23.152954   49088 command_runner.go:130] >       "ipPref": "",
	I1202 19:07:23.152958   49088 command_runner.go:130] >       "maxConfNum": 1,
	I1202 19:07:23.152963   49088 command_runner.go:130] >       "setupSerially": false,
	I1202 19:07:23.152969   49088 command_runner.go:130] >       "useInternalLoopback": false
	I1202 19:07:23.152977   49088 command_runner.go:130] >     },
	I1202 19:07:23.152983   49088 command_runner.go:130] >     "containerd": {
	I1202 19:07:23.152992   49088 command_runner.go:130] >       "defaultRuntimeName": "runc",
	I1202 19:07:23.152997   49088 command_runner.go:130] >       "ignoreBlockIONotEnabledErrors": false,
	I1202 19:07:23.153006   49088 command_runner.go:130] >       "ignoreRdtNotEnabledErrors": false,
	I1202 19:07:23.153010   49088 command_runner.go:130] >       "runtimes": {
	I1202 19:07:23.153017   49088 command_runner.go:130] >         "runc": {
	I1202 19:07:23.153022   49088 command_runner.go:130] >           "ContainerAnnotations": null,
	I1202 19:07:23.153026   49088 command_runner.go:130] >           "PodAnnotations": null,
	I1202 19:07:23.153031   49088 command_runner.go:130] >           "baseRuntimeSpec": "",
	I1202 19:07:23.153035   49088 command_runner.go:130] >           "cgroupWritable": false,
	I1202 19:07:23.153041   49088 command_runner.go:130] >           "cniConfDir": "",
	I1202 19:07:23.153046   49088 command_runner.go:130] >           "cniMaxConfNum": 0,
	I1202 19:07:23.153053   49088 command_runner.go:130] >           "io_type": "",
	I1202 19:07:23.153058   49088 command_runner.go:130] >           "options": {
	I1202 19:07:23.153066   49088 command_runner.go:130] >             "BinaryName": "",
	I1202 19:07:23.153071   49088 command_runner.go:130] >             "CriuImagePath": "",
	I1202 19:07:23.153079   49088 command_runner.go:130] >             "CriuWorkPath": "",
	I1202 19:07:23.153083   49088 command_runner.go:130] >             "IoGid": 0,
	I1202 19:07:23.153091   49088 command_runner.go:130] >             "IoUid": 0,
	I1202 19:07:23.153096   49088 command_runner.go:130] >             "NoNewKeyring": false,
	I1202 19:07:23.153100   49088 command_runner.go:130] >             "Root": "",
	I1202 19:07:23.153104   49088 command_runner.go:130] >             "ShimCgroup": "",
	I1202 19:07:23.153111   49088 command_runner.go:130] >             "SystemdCgroup": false
	I1202 19:07:23.153115   49088 command_runner.go:130] >           },
	I1202 19:07:23.153120   49088 command_runner.go:130] >           "privileged_without_host_devices": false,
	I1202 19:07:23.153128   49088 command_runner.go:130] >           "privileged_without_host_devices_all_devices_allowed": false,
	I1202 19:07:23.153136   49088 command_runner.go:130] >           "runtimePath": "",
	I1202 19:07:23.153143   49088 command_runner.go:130] >           "runtimeType": "io.containerd.runc.v2",
	I1202 19:07:23.153237   49088 command_runner.go:130] >           "sandboxer": "podsandbox",
	I1202 19:07:23.153375   49088 command_runner.go:130] >           "snapshotter": ""
	I1202 19:07:23.153385   49088 command_runner.go:130] >         }
	I1202 19:07:23.153389   49088 command_runner.go:130] >       }
	I1202 19:07:23.153393   49088 command_runner.go:130] >     },
	I1202 19:07:23.153414   49088 command_runner.go:130] >     "containerdEndpoint": "/run/containerd/containerd.sock",
	I1202 19:07:23.153424   49088 command_runner.go:130] >     "containerdRootDir": "/var/lib/containerd",
	I1202 19:07:23.153435   49088 command_runner.go:130] >     "device_ownership_from_security_context": false,
	I1202 19:07:23.153444   49088 command_runner.go:130] >     "disableApparmor": false,
	I1202 19:07:23.153449   49088 command_runner.go:130] >     "disableHugetlbController": true,
	I1202 19:07:23.153457   49088 command_runner.go:130] >     "disableProcMount": false,
	I1202 19:07:23.153467   49088 command_runner.go:130] >     "drainExecSyncIOTimeout": "0s",
	I1202 19:07:23.153475   49088 command_runner.go:130] >     "enableCDI": true,
	I1202 19:07:23.153479   49088 command_runner.go:130] >     "enableSelinux": false,
	I1202 19:07:23.153484   49088 command_runner.go:130] >     "enableUnprivilegedICMP": true,
	I1202 19:07:23.153490   49088 command_runner.go:130] >     "enableUnprivilegedPorts": true,
	I1202 19:07:23.153500   49088 command_runner.go:130] >     "ignoreDeprecationWarnings": null,
	I1202 19:07:23.153508   49088 command_runner.go:130] >     "ignoreImageDefinedVolumes": false,
	I1202 19:07:23.153516   49088 command_runner.go:130] >     "maxContainerLogLineSize": 16384,
	I1202 19:07:23.153522   49088 command_runner.go:130] >     "netnsMountsUnderStateDir": false,
	I1202 19:07:23.153534   49088 command_runner.go:130] >     "restrictOOMScoreAdj": false,
	I1202 19:07:23.153544   49088 command_runner.go:130] >     "rootDir": "/var/lib/containerd/io.containerd.grpc.v1.cri",
	I1202 19:07:23.153549   49088 command_runner.go:130] >     "selinuxCategoryRange": 1024,
	I1202 19:07:23.153562   49088 command_runner.go:130] >     "stateDir": "/run/containerd/io.containerd.grpc.v1.cri",
	I1202 19:07:23.153570   49088 command_runner.go:130] >     "tolerateMissingHugetlbController": true,
	I1202 19:07:23.153575   49088 command_runner.go:130] >     "unsetSeccompProfile": ""
	I1202 19:07:23.153578   49088 command_runner.go:130] >   },
	I1202 19:07:23.153582   49088 command_runner.go:130] >   "features": {
	I1202 19:07:23.153588   49088 command_runner.go:130] >     "supplemental_groups_policy": true
	I1202 19:07:23.153597   49088 command_runner.go:130] >   },
	I1202 19:07:23.153605   49088 command_runner.go:130] >   "golang": "go1.24.9",
	I1202 19:07:23.153615   49088 command_runner.go:130] >   "lastCNILoadStatus": "cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config",
	I1202 19:07:23.153633   49088 command_runner.go:130] >   "lastCNILoadStatus.default": "cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config",
	I1202 19:07:23.153644   49088 command_runner.go:130] >   "runtimeHandlers": [
	I1202 19:07:23.153649   49088 command_runner.go:130] >     {
	I1202 19:07:23.153658   49088 command_runner.go:130] >       "features": {
	I1202 19:07:23.153664   49088 command_runner.go:130] >         "recursive_read_only_mounts": true,
	I1202 19:07:23.153669   49088 command_runner.go:130] >         "user_namespaces": true
	I1202 19:07:23.153675   49088 command_runner.go:130] >       }
	I1202 19:07:23.153679   49088 command_runner.go:130] >     },
	I1202 19:07:23.153686   49088 command_runner.go:130] >     {
	I1202 19:07:23.153691   49088 command_runner.go:130] >       "features": {
	I1202 19:07:23.153703   49088 command_runner.go:130] >         "recursive_read_only_mounts": true,
	I1202 19:07:23.153708   49088 command_runner.go:130] >         "user_namespaces": true
	I1202 19:07:23.153715   49088 command_runner.go:130] >       },
	I1202 19:07:23.153720   49088 command_runner.go:130] >       "name": "runc"
	I1202 19:07:23.153727   49088 command_runner.go:130] >     }
	I1202 19:07:23.153731   49088 command_runner.go:130] >   ],
	I1202 19:07:23.153738   49088 command_runner.go:130] >   "status": {
	I1202 19:07:23.153742   49088 command_runner.go:130] >     "conditions": [
	I1202 19:07:23.153746   49088 command_runner.go:130] >       {
	I1202 19:07:23.153751   49088 command_runner.go:130] >         "message": "",
	I1202 19:07:23.153757   49088 command_runner.go:130] >         "reason": "",
	I1202 19:07:23.153766   49088 command_runner.go:130] >         "status": true,
	I1202 19:07:23.153774   49088 command_runner.go:130] >         "type": "RuntimeReady"
	I1202 19:07:23.153781   49088 command_runner.go:130] >       },
	I1202 19:07:23.153785   49088 command_runner.go:130] >       {
	I1202 19:07:23.153792   49088 command_runner.go:130] >         "message": "Network plugin returns error: cni plugin not initialized",
	I1202 19:07:23.153797   49088 command_runner.go:130] >         "reason": "NetworkPluginNotReady",
	I1202 19:07:23.153805   49088 command_runner.go:130] >         "status": false,
	I1202 19:07:23.153810   49088 command_runner.go:130] >         "type": "NetworkReady"
	I1202 19:07:23.153814   49088 command_runner.go:130] >       },
	I1202 19:07:23.153820   49088 command_runner.go:130] >       {
	I1202 19:07:23.153824   49088 command_runner.go:130] >         "message": "",
	I1202 19:07:23.153828   49088 command_runner.go:130] >         "reason": "",
	I1202 19:07:23.153836   49088 command_runner.go:130] >         "status": true,
	I1202 19:07:23.153850   49088 command_runner.go:130] >         "type": "ContainerdHasNoDeprecationWarnings"
	I1202 19:07:23.153857   49088 command_runner.go:130] >       }
	I1202 19:07:23.153861   49088 command_runner.go:130] >     ]
	I1202 19:07:23.153868   49088 command_runner.go:130] >   }
	I1202 19:07:23.153871   49088 command_runner.go:130] > }
	I1202 19:07:23.157283   49088 cni.go:84] Creating CNI manager for ""
	I1202 19:07:23.157307   49088 cni.go:143] "docker" driver + "containerd" runtime found, recommending kindnet
	I1202 19:07:23.157324   49088 kubeadm.go:85] Using pod CIDR: 10.244.0.0/16
	I1202 19:07:23.157352   49088 kubeadm.go:190] kubeadm options: {CertDir:/var/lib/minikube/certs ServiceCIDR:10.96.0.0/12 PodSubnet:10.244.0.0/16 AdvertiseAddress:192.168.49.2 APIServerPort:8441 KubernetesVersion:v1.35.0-beta.0 EtcdDataDir:/var/lib/minikube/etcd EtcdExtraArgs:map[] ClusterName:functional-449836 NodeName:functional-449836 DNSDomain:cluster.local CRISocket:/run/containerd/containerd.sock ImageRepository: ComponentOptions:[{Component:apiServer ExtraArgs:map[enable-admission-plugins:NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota] Pairs:map[certSANs:["127.0.0.1", "localhost", "192.168.49.2"]]} {Component:controllerManager ExtraArgs:map[allocate-node-cidrs:true leader-elect:false] Pairs:map[]} {Component:scheduler ExtraArgs:map[leader-elect:false] Pairs:map[]}] FeatureArgs:map[] NodeIP:192.168.49.2 CgroupDriver:cgroupfs ClientCAFile:/var/lib/minikube/certs/ca.crt StaticPodPath:/etc/kubernetes/manifests ControlPlaneAddress:control-plane.minikube.internal KubeProxyOptions:map[] ResolvConfSearchRegression:false KubeletConfigOpts:map[containerRuntimeEndpoint:unix:///run/containerd/containerd.sock hairpinMode:hairpin-veth runtimeRequestTimeout:15m] PrependCriSocketUnix:true}
	I1202 19:07:23.157503   49088 kubeadm.go:196] kubeadm config:
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: InitConfiguration
	localAPIEndpoint:
	  advertiseAddress: 192.168.49.2
	  bindPort: 8441
	bootstrapTokens:
	  - groups:
	      - system:bootstrappers:kubeadm:default-node-token
	    ttl: 24h0m0s
	    usages:
	      - signing
	      - authentication
	nodeRegistration:
	  criSocket: unix:///run/containerd/containerd.sock
	  name: "functional-449836"
	  kubeletExtraArgs:
	    - name: "node-ip"
	      value: "192.168.49.2"
	  taints: []
	---
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: ClusterConfiguration
	apiServer:
	  certSANs: ["127.0.0.1", "localhost", "192.168.49.2"]
	  extraArgs:
	    - name: "enable-admission-plugins"
	      value: "NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota"
	controllerManager:
	  extraArgs:
	    - name: "allocate-node-cidrs"
	      value: "true"
	    - name: "leader-elect"
	      value: "false"
	scheduler:
	  extraArgs:
	    - name: "leader-elect"
	      value: "false"
	certificatesDir: /var/lib/minikube/certs
	clusterName: mk
	controlPlaneEndpoint: control-plane.minikube.internal:8441
	etcd:
	  local:
	    dataDir: /var/lib/minikube/etcd
	kubernetesVersion: v1.35.0-beta.0
	networking:
	  dnsDomain: cluster.local
	  podSubnet: "10.244.0.0/16"
	  serviceSubnet: 10.96.0.0/12
	---
	apiVersion: kubelet.config.k8s.io/v1beta1
	kind: KubeletConfiguration
	authentication:
	  x509:
	    clientCAFile: /var/lib/minikube/certs/ca.crt
	cgroupDriver: cgroupfs
	containerRuntimeEndpoint: unix:///run/containerd/containerd.sock
	hairpinMode: hairpin-veth
	runtimeRequestTimeout: 15m
	clusterDomain: "cluster.local"
	# disable disk resource management by default
	imageGCHighThresholdPercent: 100
	evictionHard:
	  nodefs.available: "0%"
	  nodefs.inodesFree: "0%"
	  imagefs.available: "0%"
	failSwapOn: false
	staticPodPath: /etc/kubernetes/manifests
	---
	apiVersion: kubeproxy.config.k8s.io/v1alpha1
	kind: KubeProxyConfiguration
	clusterCIDR: "10.244.0.0/16"
	metricsBindAddress: 0.0.0.0:10249
	conntrack:
	  maxPerCore: 0
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_established"
	  tcpEstablishedTimeout: 0s
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_close"
	  tcpCloseWaitTimeout: 0s
	
	I1202 19:07:23.157589   49088 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.35.0-beta.0
	I1202 19:07:23.165274   49088 command_runner.go:130] > kubeadm
	I1202 19:07:23.165296   49088 command_runner.go:130] > kubectl
	I1202 19:07:23.165301   49088 command_runner.go:130] > kubelet
	I1202 19:07:23.166244   49088 binaries.go:51] Found k8s binaries, skipping transfer
	I1202 19:07:23.166309   49088 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /var/tmp/minikube
	I1202 19:07:23.176520   49088 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (328 bytes)
	I1202 19:07:23.191534   49088 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (359 bytes)
	I1202 19:07:23.207596   49088 ssh_runner.go:362] scp memory --> /var/tmp/minikube/kubeadm.yaml.new (2237 bytes)
	I1202 19:07:23.221899   49088 ssh_runner.go:195] Run: grep 192.168.49.2	control-plane.minikube.internal$ /etc/hosts
	I1202 19:07:23.225538   49088 command_runner.go:130] > 192.168.49.2	control-plane.minikube.internal
	I1202 19:07:23.225972   49088 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1202 19:07:23.344071   49088 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I1202 19:07:24.171449   49088 certs.go:69] Setting up /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836 for IP: 192.168.49.2
	I1202 19:07:24.171473   49088 certs.go:195] generating shared ca certs ...
	I1202 19:07:24.171491   49088 certs.go:227] acquiring lock for ca certs: {Name:mk2ce7651a779b9fbf8eac798f9ac184328de0c6 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1202 19:07:24.171633   49088 certs.go:236] skipping valid "minikubeCA" ca cert: /home/jenkins/minikube-integration/22021-2487/.minikube/ca.key
	I1202 19:07:24.171683   49088 certs.go:236] skipping valid "proxyClientCA" ca cert: /home/jenkins/minikube-integration/22021-2487/.minikube/proxy-client-ca.key
	I1202 19:07:24.171697   49088 certs.go:257] generating profile certs ...
	I1202 19:07:24.171794   49088 certs.go:360] skipping valid signed profile cert regeneration for "minikube-user": /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/client.key
	I1202 19:07:24.171860   49088 certs.go:360] skipping valid signed profile cert regeneration for "minikube": /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/apiserver.key.a65b71da
	I1202 19:07:24.171905   49088 certs.go:360] skipping valid signed profile cert regeneration for "aggregator": /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/proxy-client.key
	I1202 19:07:24.171916   49088 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22021-2487/.minikube/ca.crt -> /var/lib/minikube/certs/ca.crt
	I1202 19:07:24.171929   49088 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22021-2487/.minikube/ca.key -> /var/lib/minikube/certs/ca.key
	I1202 19:07:24.171946   49088 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22021-2487/.minikube/proxy-client-ca.crt -> /var/lib/minikube/certs/proxy-client-ca.crt
	I1202 19:07:24.171957   49088 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22021-2487/.minikube/proxy-client-ca.key -> /var/lib/minikube/certs/proxy-client-ca.key
	I1202 19:07:24.171972   49088 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/apiserver.crt -> /var/lib/minikube/certs/apiserver.crt
	I1202 19:07:24.171985   49088 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/apiserver.key -> /var/lib/minikube/certs/apiserver.key
	I1202 19:07:24.172001   49088 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/proxy-client.crt -> /var/lib/minikube/certs/proxy-client.crt
	I1202 19:07:24.172012   49088 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/proxy-client.key -> /var/lib/minikube/certs/proxy-client.key
	I1202 19:07:24.172062   49088 certs.go:484] found cert: /home/jenkins/minikube-integration/22021-2487/.minikube/certs/4435.pem (1338 bytes)
	W1202 19:07:24.172113   49088 certs.go:480] ignoring /home/jenkins/minikube-integration/22021-2487/.minikube/certs/4435_empty.pem, impossibly tiny 0 bytes
	I1202 19:07:24.172126   49088 certs.go:484] found cert: /home/jenkins/minikube-integration/22021-2487/.minikube/certs/ca-key.pem (1679 bytes)
	I1202 19:07:24.172154   49088 certs.go:484] found cert: /home/jenkins/minikube-integration/22021-2487/.minikube/certs/ca.pem (1082 bytes)
	I1202 19:07:24.172189   49088 certs.go:484] found cert: /home/jenkins/minikube-integration/22021-2487/.minikube/certs/cert.pem (1123 bytes)
	I1202 19:07:24.172215   49088 certs.go:484] found cert: /home/jenkins/minikube-integration/22021-2487/.minikube/certs/key.pem (1675 bytes)
	I1202 19:07:24.172266   49088 certs.go:484] found cert: /home/jenkins/minikube-integration/22021-2487/.minikube/files/etc/ssl/certs/44352.pem (1708 bytes)
	I1202 19:07:24.172298   49088 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22021-2487/.minikube/files/etc/ssl/certs/44352.pem -> /usr/share/ca-certificates/44352.pem
	I1202 19:07:24.172314   49088 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22021-2487/.minikube/ca.crt -> /usr/share/ca-certificates/minikubeCA.pem
	I1202 19:07:24.172347   49088 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22021-2487/.minikube/certs/4435.pem -> /usr/share/ca-certificates/4435.pem
	I1202 19:07:24.172878   49088 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I1202 19:07:24.192840   49088 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1675 bytes)
	I1202 19:07:24.210709   49088 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I1202 19:07:24.228270   49088 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1675 bytes)
	I1202 19:07:24.246519   49088 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1424 bytes)
	I1202 19:07:24.264649   49088 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1675 bytes)
	I1202 19:07:24.283289   49088 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I1202 19:07:24.302316   49088 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1675 bytes)
	I1202 19:07:24.320907   49088 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/files/etc/ssl/certs/44352.pem --> /usr/share/ca-certificates/44352.pem (1708 bytes)
	I1202 19:07:24.338895   49088 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I1202 19:07:24.356995   49088 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/certs/4435.pem --> /usr/share/ca-certificates/4435.pem (1338 bytes)
	I1202 19:07:24.374784   49088 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (738 bytes)
	I1202 19:07:24.388173   49088 ssh_runner.go:195] Run: openssl version
	I1202 19:07:24.394457   49088 command_runner.go:130] > OpenSSL 3.0.17 1 Jul 2025 (Library: OpenSSL 3.0.17 1 Jul 2025)
	I1202 19:07:24.394840   49088 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/44352.pem && ln -fs /usr/share/ca-certificates/44352.pem /etc/ssl/certs/44352.pem"
	I1202 19:07:24.403512   49088 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/44352.pem
	I1202 19:07:24.407229   49088 command_runner.go:130] > -rw-r--r-- 1 root root 1708 Dec  2 18:58 /usr/share/ca-certificates/44352.pem
	I1202 19:07:24.407385   49088 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Dec  2 18:58 /usr/share/ca-certificates/44352.pem
	I1202 19:07:24.407455   49088 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/44352.pem
	I1202 19:07:24.448501   49088 command_runner.go:130] > 3ec20f2e
	I1202 19:07:24.448942   49088 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/3ec20f2e.0 || ln -fs /etc/ssl/certs/44352.pem /etc/ssl/certs/3ec20f2e.0"
	I1202 19:07:24.456981   49088 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I1202 19:07:24.465478   49088 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I1202 19:07:24.469306   49088 command_runner.go:130] > -rw-r--r-- 1 root root 1111 Dec  2 18:48 /usr/share/ca-certificates/minikubeCA.pem
	I1202 19:07:24.469374   49088 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Dec  2 18:48 /usr/share/ca-certificates/minikubeCA.pem
	I1202 19:07:24.469438   49088 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I1202 19:07:24.510270   49088 command_runner.go:130] > b5213941
	I1202 19:07:24.510784   49088 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
	I1202 19:07:24.518790   49088 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/4435.pem && ln -fs /usr/share/ca-certificates/4435.pem /etc/ssl/certs/4435.pem"
	I1202 19:07:24.527001   49088 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/4435.pem
	I1202 19:07:24.530919   49088 command_runner.go:130] > -rw-r--r-- 1 root root 1338 Dec  2 18:58 /usr/share/ca-certificates/4435.pem
	I1202 19:07:24.530959   49088 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Dec  2 18:58 /usr/share/ca-certificates/4435.pem
	I1202 19:07:24.531008   49088 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/4435.pem
	I1202 19:07:24.571727   49088 command_runner.go:130] > 51391683
	I1202 19:07:24.572161   49088 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/51391683.0 || ln -fs /etc/ssl/certs/4435.pem /etc/ssl/certs/51391683.0"
	I1202 19:07:24.580157   49088 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I1202 19:07:24.584062   49088 command_runner.go:130] >   File: /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I1202 19:07:24.584087   49088 command_runner.go:130] >   Size: 1176      	Blocks: 8          IO Block: 4096   regular file
	I1202 19:07:24.584094   49088 command_runner.go:130] > Device: 259,1	Inode: 848916      Links: 1
	I1202 19:07:24.584101   49088 command_runner.go:130] > Access: (0644/-rw-r--r--)  Uid: (    0/    root)   Gid: (    0/    root)
	I1202 19:07:24.584108   49088 command_runner.go:130] > Access: 2025-12-02 19:03:16.577964732 +0000
	I1202 19:07:24.584114   49088 command_runner.go:130] > Modify: 2025-12-02 18:59:11.965818380 +0000
	I1202 19:07:24.584119   49088 command_runner.go:130] > Change: 2025-12-02 18:59:11.965818380 +0000
	I1202 19:07:24.584125   49088 command_runner.go:130] >  Birth: 2025-12-02 18:59:11.965818380 +0000
	I1202 19:07:24.584207   49088 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-etcd-client.crt -checkend 86400
	I1202 19:07:24.630311   49088 command_runner.go:130] > Certificate will not expire
	I1202 19:07:24.630810   49088 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-kubelet-client.crt -checkend 86400
	I1202 19:07:24.671995   49088 command_runner.go:130] > Certificate will not expire
	I1202 19:07:24.672412   49088 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/server.crt -checkend 86400
	I1202 19:07:24.713648   49088 command_runner.go:130] > Certificate will not expire
	I1202 19:07:24.713758   49088 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/healthcheck-client.crt -checkend 86400
	I1202 19:07:24.754977   49088 command_runner.go:130] > Certificate will not expire
	I1202 19:07:24.755077   49088 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/peer.crt -checkend 86400
	I1202 19:07:24.800995   49088 command_runner.go:130] > Certificate will not expire
	I1202 19:07:24.801486   49088 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/front-proxy-client.crt -checkend 86400
	I1202 19:07:24.844718   49088 command_runner.go:130] > Certificate will not expire
	I1202 19:07:24.845325   49088 kubeadm.go:401] StartCluster: {Name:functional-449836 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-449836 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1202 19:07:24.845410   49088 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:paused Name: Namespaces:[kube-system]}
	I1202 19:07:24.845499   49088 ssh_runner.go:195] Run: sudo -s eval "crictl ps -a --quiet --label io.kubernetes.pod.namespace=kube-system"
	I1202 19:07:24.875465   49088 cri.go:89] found id: ""
	I1202 19:07:24.875565   49088 ssh_runner.go:195] Run: sudo ls /var/lib/kubelet/kubeadm-flags.env /var/lib/kubelet/config.yaml /var/lib/minikube/etcd
	I1202 19:07:24.882887   49088 command_runner.go:130] > /var/lib/kubelet/config.yaml
	I1202 19:07:24.882908   49088 command_runner.go:130] > /var/lib/kubelet/kubeadm-flags.env
	I1202 19:07:24.882928   49088 command_runner.go:130] > /var/lib/minikube/etcd:
	I1202 19:07:24.883961   49088 kubeadm.go:417] found existing configuration files, will attempt cluster restart
	I1202 19:07:24.884012   49088 kubeadm.go:598] restartPrimaryControlPlane start ...
	I1202 19:07:24.884084   49088 ssh_runner.go:195] Run: sudo test -d /data/minikube
	I1202 19:07:24.891632   49088 kubeadm.go:131] /data/minikube skipping compat symlinks: sudo test -d /data/minikube: Process exited with status 1
	stdout:
	
	stderr:
	I1202 19:07:24.892026   49088 kubeconfig.go:47] verify endpoint returned: get endpoint: "functional-449836" does not appear in /home/jenkins/minikube-integration/22021-2487/kubeconfig
	I1202 19:07:24.892129   49088 kubeconfig.go:62] /home/jenkins/minikube-integration/22021-2487/kubeconfig needs updating (will repair): [kubeconfig missing "functional-449836" cluster setting kubeconfig missing "functional-449836" context setting]
	I1202 19:07:24.892546   49088 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/22021-2487/kubeconfig: {Name:mka13b3f16f6b2d645ade32cb83bfcf203300413 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1202 19:07:24.892988   49088 loader.go:402] Config loaded from file:  /home/jenkins/minikube-integration/22021-2487/kubeconfig
	I1202 19:07:24.893140   49088 kapi.go:59] client config for functional-449836: &rest.Config{Host:"https://192.168.49.2:8441", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/client.crt", KeyFile:"/home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/client.key", CAFile:"/home/jenkins/minikube-integration/22021-2487/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x1fb33c0), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), WarningHandlerWithContext:rest.WarningHandlerWithContext(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	I1202 19:07:24.893652   49088 envvar.go:172] "Feature gate default state" feature="WatchListClient" enabled=false
	I1202 19:07:24.893721   49088 cert_rotation.go:141] "Starting client certificate rotation controller" logger="tls-transport-cache"
	I1202 19:07:24.893742   49088 envvar.go:172] "Feature gate default state" feature="ClientsAllowCBOR" enabled=false
	I1202 19:07:24.893817   49088 envvar.go:172] "Feature gate default state" feature="ClientsPreferCBOR" enabled=false
	I1202 19:07:24.893840   49088 envvar.go:172] "Feature gate default state" feature="InformerResourceVersion" enabled=false
	I1202 19:07:24.893879   49088 envvar.go:172] "Feature gate default state" feature="InOrderInformers" enabled=true
	I1202 19:07:24.894204   49088 ssh_runner.go:195] Run: sudo diff -u /var/tmp/minikube/kubeadm.yaml /var/tmp/minikube/kubeadm.yaml.new
	I1202 19:07:24.902267   49088 kubeadm.go:635] The running cluster does not require reconfiguration: 192.168.49.2
	I1202 19:07:24.902298   49088 kubeadm.go:602] duration metric: took 18.265587ms to restartPrimaryControlPlane
	I1202 19:07:24.902309   49088 kubeadm.go:403] duration metric: took 56.993765ms to StartCluster
	I1202 19:07:24.902355   49088 settings.go:142] acquiring lock: {Name:mka76ea0dcf16fdbb68808885f8360c0083029b4 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1202 19:07:24.902437   49088 settings.go:150] Updating kubeconfig:  /home/jenkins/minikube-integration/22021-2487/kubeconfig
	I1202 19:07:24.903036   49088 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/22021-2487/kubeconfig: {Name:mka13b3f16f6b2d645ade32cb83bfcf203300413 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1202 19:07:24.903251   49088 start.go:236] Will wait 6m0s for node &{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}
	I1202 19:07:24.903573   49088 config.go:182] Loaded profile config "functional-449836": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-beta.0
	I1202 19:07:24.903617   49088 addons.go:527] enable addons start: toEnable=map[ambassador:false amd-gpu-device-plugin:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:true efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubetail:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:true storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false]
	I1202 19:07:24.903676   49088 addons.go:70] Setting storage-provisioner=true in profile "functional-449836"
	I1202 19:07:24.903691   49088 addons.go:239] Setting addon storage-provisioner=true in "functional-449836"
	I1202 19:07:24.903717   49088 host.go:66] Checking if "functional-449836" exists ...
	I1202 19:07:24.903830   49088 addons.go:70] Setting default-storageclass=true in profile "functional-449836"
	I1202 19:07:24.903877   49088 addons_storage_classes.go:34] enableOrDisableStorageClasses default-storageclass=true on "functional-449836"
	I1202 19:07:24.904207   49088 cli_runner.go:164] Run: docker container inspect functional-449836 --format={{.State.Status}}
	I1202 19:07:24.904250   49088 cli_runner.go:164] Run: docker container inspect functional-449836 --format={{.State.Status}}
	I1202 19:07:24.909664   49088 out.go:179] * Verifying Kubernetes components...
	I1202 19:07:24.912752   49088 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1202 19:07:24.942660   49088 out.go:179]   - Using image gcr.io/k8s-minikube/storage-provisioner:v5
	I1202 19:07:24.943205   49088 loader.go:402] Config loaded from file:  /home/jenkins/minikube-integration/22021-2487/kubeconfig
	I1202 19:07:24.943381   49088 kapi.go:59] client config for functional-449836: &rest.Config{Host:"https://192.168.49.2:8441", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/client.crt", KeyFile:"/home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/client.key", CAFile:"/home/jenkins/minikube-integration/22021-2487/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x1fb33c0), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), WarningHandlerWithContext:rest.WarningHandlerWithContext(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	I1202 19:07:24.943666   49088 addons.go:239] Setting addon default-storageclass=true in "functional-449836"
	I1202 19:07:24.943695   49088 host.go:66] Checking if "functional-449836" exists ...
	I1202 19:07:24.944105   49088 cli_runner.go:164] Run: docker container inspect functional-449836 --format={{.State.Status}}
	I1202 19:07:24.945588   49088 addons.go:436] installing /etc/kubernetes/addons/storage-provisioner.yaml
	I1202 19:07:24.945617   49088 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/storage-provisioner.yaml (2676 bytes)
	I1202 19:07:24.945676   49088 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-449836
	I1202 19:07:24.976744   49088 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:32788 SSHKeyPath:/home/jenkins/minikube-integration/22021-2487/.minikube/machines/functional-449836/id_rsa Username:docker}
	I1202 19:07:24.983018   49088 addons.go:436] installing /etc/kubernetes/addons/storageclass.yaml
	I1202 19:07:24.983040   49088 ssh_runner.go:362] scp storageclass/storageclass.yaml --> /etc/kubernetes/addons/storageclass.yaml (271 bytes)
	I1202 19:07:24.983109   49088 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-449836
	I1202 19:07:25.013238   49088 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:32788 SSHKeyPath:/home/jenkins/minikube-integration/22021-2487/.minikube/machines/functional-449836/id_rsa Username:docker}
	I1202 19:07:25.139303   49088 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I1202 19:07:25.147308   49088 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1202 19:07:25.166870   49088 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply -f /etc/kubernetes/addons/storageclass.yaml
	I1202 19:07:25.922715   49088 node_ready.go:35] waiting up to 6m0s for node "functional-449836" to be "Ready" ...
	I1202 19:07:25.922842   49088 type.go:168] "Request Body" body=""
	I1202 19:07:25.922904   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:25.923137   49088 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1202 19:07:25.923161   49088 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:25.923181   49088 retry.go:31] will retry after 314.802872ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:25.923212   49088 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1202 19:07:25.923227   49088 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:25.923235   49088 retry.go:31] will retry after 316.161686ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:25.923297   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:26.238968   49088 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1202 19:07:26.239458   49088 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1202 19:07:26.312262   49088 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1202 19:07:26.312301   49088 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:26.312346   49088 retry.go:31] will retry after 358.686092ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:26.320393   49088 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1202 19:07:26.320484   49088 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:26.320525   49088 retry.go:31] will retry after 528.121505ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:26.423804   49088 type.go:168] "Request Body" body=""
	I1202 19:07:26.423895   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:26.424214   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:26.671815   49088 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1202 19:07:26.745439   49088 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1202 19:07:26.745497   49088 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:26.745515   49088 retry.go:31] will retry after 446.477413ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:26.849789   49088 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1202 19:07:26.909069   49088 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1202 19:07:26.909108   49088 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:26.909134   49088 retry.go:31] will retry after 684.877567ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:26.923341   49088 type.go:168] "Request Body" body=""
	I1202 19:07:26.923433   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:26.923791   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:27.192236   49088 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1202 19:07:27.247207   49088 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1202 19:07:27.250502   49088 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:27.250546   49088 retry.go:31] will retry after 797.707708ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:27.423790   49088 type.go:168] "Request Body" body=""
	I1202 19:07:27.423889   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:27.424196   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:27.594774   49088 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1202 19:07:27.660877   49088 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1202 19:07:27.660957   49088 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:27.660987   49088 retry.go:31] will retry after 601.48037ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:27.923401   49088 type.go:168] "Request Body" body=""
	I1202 19:07:27.923475   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:27.923784   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:07:27.923848   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:07:28.049160   49088 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1202 19:07:28.112455   49088 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1202 19:07:28.112493   49088 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:28.112512   49088 retry.go:31] will retry after 941.564206ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:28.262919   49088 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1202 19:07:28.323250   49088 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1202 19:07:28.323307   49088 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:28.323325   49088 retry.go:31] will retry after 741.834409ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:28.423555   49088 type.go:168] "Request Body" body=""
	I1202 19:07:28.423652   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:28.423971   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:28.923658   49088 type.go:168] "Request Body" body=""
	I1202 19:07:28.923731   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:28.923989   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:29.054311   49088 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1202 19:07:29.065740   49088 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1202 19:07:29.126744   49088 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1202 19:07:29.126791   49088 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:29.126812   49088 retry.go:31] will retry after 2.378740888s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:29.143543   49088 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1202 19:07:29.143609   49088 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:29.143631   49088 retry.go:31] will retry after 2.739062704s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:29.422943   49088 type.go:168] "Request Body" body=""
	I1202 19:07:29.423043   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:29.423398   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:29.923116   49088 type.go:168] "Request Body" body=""
	I1202 19:07:29.923203   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:29.923554   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:30.422925   49088 type.go:168] "Request Body" body=""
	I1202 19:07:30.423004   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:30.423294   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:07:30.423351   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:07:30.922969   49088 type.go:168] "Request Body" body=""
	I1202 19:07:30.923047   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:30.923375   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:31.422975   49088 type.go:168] "Request Body" body=""
	I1202 19:07:31.423051   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:31.423376   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:31.506668   49088 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1202 19:07:31.565098   49088 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1202 19:07:31.565149   49088 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:31.565168   49088 retry.go:31] will retry after 3.30231188s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:31.883619   49088 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1202 19:07:31.923118   49088 type.go:168] "Request Body" body=""
	I1202 19:07:31.923190   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:31.923483   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:31.949881   49088 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1202 19:07:31.953682   49088 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:31.953716   49088 retry.go:31] will retry after 2.323480137s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:32.422997   49088 type.go:168] "Request Body" body=""
	I1202 19:07:32.423069   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:32.423382   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:07:32.423441   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:07:32.923115   49088 type.go:168] "Request Body" body=""
	I1202 19:07:32.923193   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:32.923525   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:33.422891   49088 type.go:168] "Request Body" body=""
	I1202 19:07:33.422956   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:33.423209   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:33.922911   49088 type.go:168] "Request Body" body=""
	I1202 19:07:33.922985   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:33.923302   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:34.277557   49088 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1202 19:07:34.337253   49088 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1202 19:07:34.337306   49088 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:34.337326   49088 retry.go:31] will retry after 5.941517157s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:34.423656   49088 type.go:168] "Request Body" body=""
	I1202 19:07:34.423738   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:34.424084   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:07:34.424136   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:07:34.867735   49088 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1202 19:07:34.923406   49088 type.go:168] "Request Body" body=""
	I1202 19:07:34.923506   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:34.923762   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:34.931582   49088 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1202 19:07:34.931622   49088 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:34.931641   49088 retry.go:31] will retry after 5.732328972s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:35.423000   49088 type.go:168] "Request Body" body=""
	I1202 19:07:35.423076   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:35.423385   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:35.922985   49088 type.go:168] "Request Body" body=""
	I1202 19:07:35.923063   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:35.923444   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:36.422994   49088 type.go:168] "Request Body" body=""
	I1202 19:07:36.423069   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:36.423390   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:36.922999   49088 type.go:168] "Request Body" body=""
	I1202 19:07:36.923077   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:36.923397   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:07:36.923453   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:07:37.423120   49088 type.go:168] "Request Body" body=""
	I1202 19:07:37.423196   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:37.423525   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:37.923076   49088 type.go:168] "Request Body" body=""
	I1202 19:07:37.923148   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:37.923450   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:38.423008   49088 type.go:168] "Request Body" body=""
	I1202 19:07:38.423082   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:38.423432   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:38.923000   49088 type.go:168] "Request Body" body=""
	I1202 19:07:38.923074   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:38.923410   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:39.423757   49088 type.go:168] "Request Body" body=""
	I1202 19:07:39.423827   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:39.424076   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:07:39.424115   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:07:39.923939   49088 type.go:168] "Request Body" body=""
	I1202 19:07:39.924013   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:39.924357   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:40.279081   49088 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1202 19:07:40.340610   49088 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1202 19:07:40.340655   49088 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:40.340674   49088 retry.go:31] will retry after 7.832295728s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:40.423881   49088 type.go:168] "Request Body" body=""
	I1202 19:07:40.423959   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:40.424241   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:40.664676   49088 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1202 19:07:40.720825   49088 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1202 19:07:40.724043   49088 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:40.724077   49088 retry.go:31] will retry after 3.410570548s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:40.923400   49088 type.go:168] "Request Body" body=""
	I1202 19:07:40.923497   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:40.923882   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:41.423714   49088 type.go:168] "Request Body" body=""
	I1202 19:07:41.423784   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:41.424115   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:07:41.424172   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:07:41.922915   49088 type.go:168] "Request Body" body=""
	I1202 19:07:41.922990   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:41.923344   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:42.422900   49088 type.go:168] "Request Body" body=""
	I1202 19:07:42.422980   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:42.423254   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:42.922960   49088 type.go:168] "Request Body" body=""
	I1202 19:07:42.923033   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:42.923341   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:43.422982   49088 type.go:168] "Request Body" body=""
	I1202 19:07:43.423067   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:43.423429   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:43.923715   49088 type.go:168] "Request Body" body=""
	I1202 19:07:43.923780   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:43.924087   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:07:43.924145   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:07:44.135480   49088 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1202 19:07:44.194407   49088 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1202 19:07:44.194462   49088 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:44.194482   49088 retry.go:31] will retry after 9.43511002s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:44.423808   49088 type.go:168] "Request Body" body=""
	I1202 19:07:44.423884   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:44.424207   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:44.923173   49088 type.go:168] "Request Body" body=""
	I1202 19:07:44.923287   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:44.923608   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:45.423511   49088 type.go:168] "Request Body" body=""
	I1202 19:07:45.423594   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:45.423852   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:45.923682   49088 type.go:168] "Request Body" body=""
	I1202 19:07:45.923754   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:45.924062   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:46.423867   49088 type.go:168] "Request Body" body=""
	I1202 19:07:46.423945   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:46.424267   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:07:46.424344   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:07:46.923022   49088 type.go:168] "Request Body" body=""
	I1202 19:07:46.923087   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:46.923335   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:47.422976   49088 type.go:168] "Request Body" body=""
	I1202 19:07:47.423049   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:47.423383   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:47.922938   49088 type.go:168] "Request Body" body=""
	I1202 19:07:47.923018   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:47.923343   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:48.173817   49088 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1202 19:07:48.233696   49088 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1202 19:07:48.233741   49088 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:48.233760   49088 retry.go:31] will retry after 11.915058211s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:48.423860   49088 type.go:168] "Request Body" body=""
	I1202 19:07:48.423931   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:48.424192   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:48.922967   49088 type.go:168] "Request Body" body=""
	I1202 19:07:48.923043   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:48.923338   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:07:48.923389   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:07:49.423071   49088 type.go:168] "Request Body" body=""
	I1202 19:07:49.423160   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:49.423457   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:49.923767   49088 type.go:168] "Request Body" body=""
	I1202 19:07:49.923839   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:49.924094   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:50.423628   49088 type.go:168] "Request Body" body=""
	I1202 19:07:50.423697   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:50.424008   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:50.923823   49088 type.go:168] "Request Body" body=""
	I1202 19:07:50.923896   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:50.924199   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:07:50.924253   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:07:51.423846   49088 type.go:168] "Request Body" body=""
	I1202 19:07:51.423915   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:51.424234   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:51.922982   49088 type.go:168] "Request Body" body=""
	I1202 19:07:51.923118   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:51.923450   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:52.423137   49088 type.go:168] "Request Body" body=""
	I1202 19:07:52.423209   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:52.423568   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:52.923553   49088 type.go:168] "Request Body" body=""
	I1202 19:07:52.923629   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:52.923890   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:53.423671   49088 type.go:168] "Request Body" body=""
	I1202 19:07:53.423751   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:53.424089   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:07:53.424151   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:07:53.630602   49088 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1202 19:07:53.701195   49088 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1202 19:07:53.708777   49088 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:53.708825   49088 retry.go:31] will retry after 18.228322251s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:53.923261   49088 type.go:168] "Request Body" body=""
	I1202 19:07:53.923336   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:53.923674   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:54.422909   49088 type.go:168] "Request Body" body=""
	I1202 19:07:54.422976   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:54.423235   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:54.923162   49088 type.go:168] "Request Body" body=""
	I1202 19:07:54.923249   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:54.923575   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:55.422969   49088 type.go:168] "Request Body" body=""
	I1202 19:07:55.423050   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:55.423372   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:55.922912   49088 type.go:168] "Request Body" body=""
	I1202 19:07:55.922984   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:55.923296   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:07:55.923346   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:07:56.422984   49088 type.go:168] "Request Body" body=""
	I1202 19:07:56.423058   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:56.423408   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:56.922983   49088 type.go:168] "Request Body" body=""
	I1202 19:07:56.923062   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:56.923429   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:57.423124   49088 type.go:168] "Request Body" body=""
	I1202 19:07:57.423196   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:57.423456   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:57.922920   49088 type.go:168] "Request Body" body=""
	I1202 19:07:57.922991   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:57.923317   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:07:57.923373   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:07:58.422950   49088 type.go:168] "Request Body" body=""
	I1202 19:07:58.423033   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:58.423366   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:58.923630   49088 type.go:168] "Request Body" body=""
	I1202 19:07:58.923696   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:58.924020   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:59.423807   49088 type.go:168] "Request Body" body=""
	I1202 19:07:59.423887   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:59.424243   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:59.922942   49088 type.go:168] "Request Body" body=""
	I1202 19:07:59.923022   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:59.923353   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:07:59.923410   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:08:00.150075   49088 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1202 19:08:00.323059   49088 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1202 19:08:00.323111   49088 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:08:00.323132   49088 retry.go:31] will retry after 12.256345503s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:08:00.423512   49088 type.go:168] "Request Body" body=""
	I1202 19:08:00.423597   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:00.423977   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:00.923784   49088 type.go:168] "Request Body" body=""
	I1202 19:08:00.923865   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:00.924196   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:01.422909   49088 type.go:168] "Request Body" body=""
	I1202 19:08:01.422990   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:01.423304   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:01.922933   49088 type.go:168] "Request Body" body=""
	I1202 19:08:01.923032   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:01.923287   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:02.422978   49088 type.go:168] "Request Body" body=""
	I1202 19:08:02.423052   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:02.423379   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:08:02.423436   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:08:02.923122   49088 type.go:168] "Request Body" body=""
	I1202 19:08:02.923245   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:02.923555   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:03.423814   49088 type.go:168] "Request Body" body=""
	I1202 19:08:03.423889   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:03.424141   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:03.923895   49088 type.go:168] "Request Body" body=""
	I1202 19:08:03.923996   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:03.924288   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:04.422987   49088 type.go:168] "Request Body" body=""
	I1202 19:08:04.423083   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:04.423375   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:04.922988   49088 type.go:168] "Request Body" body=""
	I1202 19:08:04.923059   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:04.923336   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:08:04.923376   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:08:05.422976   49088 type.go:168] "Request Body" body=""
	I1202 19:08:05.423047   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:05.423391   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:05.922959   49088 type.go:168] "Request Body" body=""
	I1202 19:08:05.923031   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:05.923359   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:06.423783   49088 type.go:168] "Request Body" body=""
	I1202 19:08:06.423854   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:06.424112   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:06.923877   49088 type.go:168] "Request Body" body=""
	I1202 19:08:06.923974   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:06.924303   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:08:06.924381   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:08:07.423044   49088 type.go:168] "Request Body" body=""
	I1202 19:08:07.423125   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:07.423474   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:07.922856   49088 type.go:168] "Request Body" body=""
	I1202 19:08:07.922930   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:07.923205   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:08.422913   49088 type.go:168] "Request Body" body=""
	I1202 19:08:08.422992   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:08.423315   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:08.922886   49088 type.go:168] "Request Body" body=""
	I1202 19:08:08.922995   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:08.923313   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:09.422913   49088 type.go:168] "Request Body" body=""
	I1202 19:08:09.423006   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:09.423295   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:08:09.423343   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:08:09.922894   49088 type.go:168] "Request Body" body=""
	I1202 19:08:09.922995   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:09.923296   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:10.423073   49088 type.go:168] "Request Body" body=""
	I1202 19:08:10.423153   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:10.423491   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:10.923741   49088 type.go:168] "Request Body" body=""
	I1202 19:08:10.923814   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:10.924073   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:11.423834   49088 type.go:168] "Request Body" body=""
	I1202 19:08:11.423907   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:11.424248   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:08:11.424304   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:08:11.922960   49088 type.go:168] "Request Body" body=""
	I1202 19:08:11.923033   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:11.923342   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:11.937687   49088 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1202 19:08:11.996748   49088 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1202 19:08:11.999800   49088 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:08:11.999828   49088 retry.go:31] will retry after 12.016513449s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:08:12.423502   49088 type.go:168] "Request Body" body=""
	I1202 19:08:12.423582   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:12.423831   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:12.580354   49088 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1202 19:08:12.637408   49088 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1202 19:08:12.637456   49088 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:08:12.637477   49088 retry.go:31] will retry after 30.215930355s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:08:12.923948   49088 type.go:168] "Request Body" body=""
	I1202 19:08:12.924043   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:12.924384   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:13.422995   49088 type.go:168] "Request Body" body=""
	I1202 19:08:13.423082   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:13.423402   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:13.923854   49088 type.go:168] "Request Body" body=""
	I1202 19:08:13.923924   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:13.924172   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:08:13.924221   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:08:14.422931   49088 type.go:168] "Request Body" body=""
	I1202 19:08:14.423013   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:14.423360   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:14.923106   49088 type.go:168] "Request Body" body=""
	I1202 19:08:14.923201   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:14.923504   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:15.423455   49088 type.go:168] "Request Body" body=""
	I1202 19:08:15.423543   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:15.423801   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:15.923582   49088 type.go:168] "Request Body" body=""
	I1202 19:08:15.923658   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:15.923982   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:16.423696   49088 type.go:168] "Request Body" body=""
	I1202 19:08:16.423768   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:16.424069   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:08:16.424123   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:08:16.923441   49088 type.go:168] "Request Body" body=""
	I1202 19:08:16.923513   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:16.923823   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:17.423623   49088 type.go:168] "Request Body" body=""
	I1202 19:08:17.423715   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:17.424059   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:17.923916   49088 type.go:168] "Request Body" body=""
	I1202 19:08:17.923987   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:17.924303   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:18.422927   49088 type.go:168] "Request Body" body=""
	I1202 19:08:18.423013   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:18.423293   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:18.922978   49088 type.go:168] "Request Body" body=""
	I1202 19:08:18.923057   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:18.923439   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:08:18.923511   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:08:19.423204   49088 type.go:168] "Request Body" body=""
	I1202 19:08:19.423280   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:19.423633   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:19.923322   49088 type.go:168] "Request Body" body=""
	I1202 19:08:19.923392   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:19.923647   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:20.423776   49088 type.go:168] "Request Body" body=""
	I1202 19:08:20.423870   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:20.424201   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:20.922961   49088 type.go:168] "Request Body" body=""
	I1202 19:08:20.923040   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:20.923370   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:21.423693   49088 type.go:168] "Request Body" body=""
	I1202 19:08:21.423801   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:21.424068   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:08:21.424117   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:08:21.923863   49088 type.go:168] "Request Body" body=""
	I1202 19:08:21.923935   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:21.924262   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:22.422993   49088 type.go:168] "Request Body" body=""
	I1202 19:08:22.423066   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:22.423391   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:22.922927   49088 type.go:168] "Request Body" body=""
	I1202 19:08:22.922998   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:22.923323   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:23.422977   49088 type.go:168] "Request Body" body=""
	I1202 19:08:23.423052   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:23.423364   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:23.922971   49088 type.go:168] "Request Body" body=""
	I1202 19:08:23.923047   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:23.923343   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:08:23.923391   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:08:24.016567   49088 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1202 19:08:24.078750   49088 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1202 19:08:24.078790   49088 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:08:24.078809   49088 retry.go:31] will retry after 37.473532818s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:08:24.423149   49088 type.go:168] "Request Body" body=""
	I1202 19:08:24.423225   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:24.423606   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:24.923585   49088 type.go:168] "Request Body" body=""
	I1202 19:08:24.923686   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:24.924015   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:25.423855   49088 type.go:168] "Request Body" body=""
	I1202 19:08:25.423933   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:25.424248   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:25.923542   49088 type.go:168] "Request Body" body=""
	I1202 19:08:25.923615   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:25.923871   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:08:25.923923   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:08:26.423702   49088 type.go:168] "Request Body" body=""
	I1202 19:08:26.423799   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:26.424100   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:26.923908   49088 type.go:168] "Request Body" body=""
	I1202 19:08:26.923990   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:26.924357   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:27.422916   49088 type.go:168] "Request Body" body=""
	I1202 19:08:27.423000   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:27.423295   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:27.922995   49088 type.go:168] "Request Body" body=""
	I1202 19:08:27.923085   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:27.923386   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:28.423073   49088 type.go:168] "Request Body" body=""
	I1202 19:08:28.423198   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:28.423550   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:08:28.423605   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:08:28.923207   49088 type.go:168] "Request Body" body=""
	I1202 19:08:28.923280   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:28.923547   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:29.423229   49088 type.go:168] "Request Body" body=""
	I1202 19:08:29.423310   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:29.423621   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:29.922966   49088 type.go:168] "Request Body" body=""
	I1202 19:08:29.923038   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:29.923334   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:30.422914   49088 type.go:168] "Request Body" body=""
	I1202 19:08:30.422986   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:30.423290   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:30.922992   49088 type.go:168] "Request Body" body=""
	I1202 19:08:30.923070   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:30.923374   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:08:30.923423   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:08:31.422962   49088 type.go:168] "Request Body" body=""
	I1202 19:08:31.423044   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:31.423370   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:31.923658   49088 type.go:168] "Request Body" body=""
	I1202 19:08:31.923727   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:31.923984   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:32.423783   49088 type.go:168] "Request Body" body=""
	I1202 19:08:32.423853   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:32.424194   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:32.923864   49088 type.go:168] "Request Body" body=""
	I1202 19:08:32.923952   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:32.924274   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:08:32.924361   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:08:33.422870   49088 type.go:168] "Request Body" body=""
	I1202 19:08:33.422935   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:33.423233   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:33.922930   49088 type.go:168] "Request Body" body=""
	I1202 19:08:33.923021   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:33.923340   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:34.423053   49088 type.go:168] "Request Body" body=""
	I1202 19:08:34.423137   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:34.423440   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:34.923286   49088 type.go:168] "Request Body" body=""
	I1202 19:08:34.923353   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:34.923610   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:35.423738   49088 type.go:168] "Request Body" body=""
	I1202 19:08:35.423811   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:35.424122   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:08:35.424178   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:08:35.923982   49088 type.go:168] "Request Body" body=""
	I1202 19:08:35.924054   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:35.924397   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:36.423490   49088 type.go:168] "Request Body" body=""
	I1202 19:08:36.423577   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:36.423904   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:36.923609   49088 type.go:168] "Request Body" body=""
	I1202 19:08:36.923698   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:36.924003   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:37.423836   49088 type.go:168] "Request Body" body=""
	I1202 19:08:37.423908   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:37.424273   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:08:37.424347   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:08:37.923878   49088 type.go:168] "Request Body" body=""
	I1202 19:08:37.923949   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:37.924222   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:38.422922   49088 type.go:168] "Request Body" body=""
	I1202 19:08:38.422995   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:38.423329   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:38.923060   49088 type.go:168] "Request Body" body=""
	I1202 19:08:38.923133   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:38.923451   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:39.422914   49088 type.go:168] "Request Body" body=""
	I1202 19:08:39.422990   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:39.423240   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:39.922980   49088 type.go:168] "Request Body" body=""
	I1202 19:08:39.923057   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:39.923354   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:08:39.923403   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:08:40.423331   49088 type.go:168] "Request Body" body=""
	I1202 19:08:40.423423   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:40.423754   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:40.923541   49088 type.go:168] "Request Body" body=""
	I1202 19:08:40.923652   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:40.923952   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:41.423758   49088 type.go:168] "Request Body" body=""
	I1202 19:08:41.423829   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:41.424159   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:41.922904   49088 type.go:168] "Request Body" body=""
	I1202 19:08:41.922987   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:41.923363   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:42.423568   49088 type.go:168] "Request Body" body=""
	I1202 19:08:42.423637   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:42.423879   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:08:42.423921   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:08:42.854609   49088 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1202 19:08:42.913285   49088 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1202 19:08:42.916268   49088 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:08:42.916300   49088 retry.go:31] will retry after 24.794449401s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:08:42.923470   49088 type.go:168] "Request Body" body=""
	I1202 19:08:42.923553   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:42.923860   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:43.423622   49088 type.go:168] "Request Body" body=""
	I1202 19:08:43.423694   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:43.423983   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:43.923751   49088 type.go:168] "Request Body" body=""
	I1202 19:08:43.923834   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:43.924123   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:44.422913   49088 type.go:168] "Request Body" body=""
	I1202 19:08:44.423014   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:44.423327   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:44.923006   49088 type.go:168] "Request Body" body=""
	I1202 19:08:44.923080   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:44.923422   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:08:44.923476   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:08:45.422929   49088 type.go:168] "Request Body" body=""
	I1202 19:08:45.423000   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:45.423274   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:45.922967   49088 type.go:168] "Request Body" body=""
	I1202 19:08:45.923040   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:45.923364   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:46.422980   49088 type.go:168] "Request Body" body=""
	I1202 19:08:46.423051   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:46.423401   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:46.922941   49088 type.go:168] "Request Body" body=""
	I1202 19:08:46.923013   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:46.923277   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:47.422968   49088 type.go:168] "Request Body" body=""
	I1202 19:08:47.423062   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:47.423388   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:08:47.423440   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:08:47.922972   49088 type.go:168] "Request Body" body=""
	I1202 19:08:47.923042   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:47.923383   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:48.423521   49088 type.go:168] "Request Body" body=""
	I1202 19:08:48.423697   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:48.424010   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:48.923782   49088 type.go:168] "Request Body" body=""
	I1202 19:08:48.923856   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:48.924186   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:49.422919   49088 type.go:168] "Request Body" body=""
	I1202 19:08:49.422992   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:49.423372   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:49.922931   49088 type.go:168] "Request Body" body=""
	I1202 19:08:49.923004   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:49.923306   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:08:49.923362   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:08:50.423001   49088 type.go:168] "Request Body" body=""
	I1202 19:08:50.423076   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:50.423400   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:50.922993   49088 type.go:168] "Request Body" body=""
	I1202 19:08:50.923072   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:50.923422   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:51.423082   49088 type.go:168] "Request Body" body=""
	I1202 19:08:51.423174   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:51.423497   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:51.922985   49088 type.go:168] "Request Body" body=""
	I1202 19:08:51.923056   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:51.923335   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:08:51.923382   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:08:52.423062   49088 type.go:168] "Request Body" body=""
	I1202 19:08:52.423141   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:52.423469   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:52.922906   49088 type.go:168] "Request Body" body=""
	I1202 19:08:52.922982   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:52.923304   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:53.422987   49088 type.go:168] "Request Body" body=""
	I1202 19:08:53.423074   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:53.423405   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:53.923177   49088 type.go:168] "Request Body" body=""
	I1202 19:08:53.923253   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:53.923577   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:08:53.923645   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:08:54.422880   49088 type.go:168] "Request Body" body=""
	I1202 19:08:54.422958   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:54.423220   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:54.923069   49088 type.go:168] "Request Body" body=""
	I1202 19:08:54.923142   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:54.923466   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:55.423373   49088 type.go:168] "Request Body" body=""
	I1202 19:08:55.423459   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:55.423806   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:55.923603   49088 type.go:168] "Request Body" body=""
	I1202 19:08:55.923681   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:55.923944   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:08:55.923992   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:08:56.423750   49088 type.go:168] "Request Body" body=""
	I1202 19:08:56.423821   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:56.424196   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:56.922939   49088 type.go:168] "Request Body" body=""
	I1202 19:08:56.923015   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:56.923399   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:57.423718   49088 type.go:168] "Request Body" body=""
	I1202 19:08:57.423789   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:57.424085   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:57.923903   49088 type.go:168] "Request Body" body=""
	I1202 19:08:57.923980   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:57.924302   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:08:57.924374   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:08:58.422958   49088 type.go:168] "Request Body" body=""
	I1202 19:08:58.423030   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:58.423367   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:58.923777   49088 type.go:168] "Request Body" body=""
	I1202 19:08:58.923851   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:58.924127   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:59.423881   49088 type.go:168] "Request Body" body=""
	I1202 19:08:59.423956   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:59.424305   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:59.922904   49088 type.go:168] "Request Body" body=""
	I1202 19:08:59.922978   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:59.923298   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:00.423245   49088 type.go:168] "Request Body" body=""
	I1202 19:09:00.423318   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:00.423619   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:09:00.423665   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:09:00.922984   49088 type.go:168] "Request Body" body=""
	I1202 19:09:00.923063   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:00.923384   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:01.422976   49088 type.go:168] "Request Body" body=""
	I1202 19:09:01.423055   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:01.423390   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:01.552630   49088 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1202 19:09:01.616821   49088 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1202 19:09:01.616872   49088 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1202 19:09:01.616967   49088 out.go:285] ! Enabling 'storage-provisioner' returned an error: running callbacks: [sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	]
	I1202 19:09:01.923268   49088 type.go:168] "Request Body" body=""
	I1202 19:09:01.923333   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:01.923595   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:02.423036   49088 type.go:168] "Request Body" body=""
	I1202 19:09:02.423106   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:02.423425   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:02.922957   49088 type.go:168] "Request Body" body=""
	I1202 19:09:02.923030   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:02.923349   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:09:02.923411   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:09:03.422870   49088 type.go:168] "Request Body" body=""
	I1202 19:09:03.422937   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:03.423202   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:03.922969   49088 type.go:168] "Request Body" body=""
	I1202 19:09:03.923048   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:03.923382   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:04.422966   49088 type.go:168] "Request Body" body=""
	I1202 19:09:04.423045   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:04.423360   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:04.923022   49088 type.go:168] "Request Body" body=""
	I1202 19:09:04.923097   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:04.923357   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:05.423319   49088 type.go:168] "Request Body" body=""
	I1202 19:09:05.423392   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:05.423740   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:09:05.423793   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:09:05.923326   49088 type.go:168] "Request Body" body=""
	I1202 19:09:05.923409   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:05.923718   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:06.423454   49088 type.go:168] "Request Body" body=""
	I1202 19:09:06.423525   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:06.423826   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:06.923640   49088 type.go:168] "Request Body" body=""
	I1202 19:09:06.923716   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:06.924092   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:07.423755   49088 type.go:168] "Request Body" body=""
	I1202 19:09:07.423833   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:07.424174   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:09:07.424240   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:09:07.711667   49088 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1202 19:09:07.768083   49088 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1202 19:09:07.771273   49088 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1202 19:09:07.771371   49088 out.go:285] ! Enabling 'default-storageclass' returned an error: running callbacks: [sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	]
	I1202 19:09:07.774489   49088 out.go:179] * Enabled addons: 
	I1202 19:09:07.778178   49088 addons.go:530] duration metric: took 1m42.874553995s for enable addons: enabled=[]
	I1202 19:09:07.923592   49088 type.go:168] "Request Body" body=""
	I1202 19:09:07.923663   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:07.923975   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:08.423753   49088 type.go:168] "Request Body" body=""
	I1202 19:09:08.423867   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:08.424222   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:08.922931   49088 type.go:168] "Request Body" body=""
	I1202 19:09:08.923003   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:08.923358   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:09.423790   49088 type.go:168] "Request Body" body=""
	I1202 19:09:09.423880   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:09.424153   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:09.922907   49088 type.go:168] "Request Body" body=""
	I1202 19:09:09.923001   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:09.923324   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:09:09.923374   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:09:10.423145   49088 type.go:168] "Request Body" body=""
	I1202 19:09:10.423260   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:10.423579   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:10.922932   49088 type.go:168] "Request Body" body=""
	I1202 19:09:10.923028   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:10.923400   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:11.422981   49088 type.go:168] "Request Body" body=""
	I1202 19:09:11.423062   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:11.423397   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:11.923082   49088 type.go:168] "Request Body" body=""
	I1202 19:09:11.923154   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:11.923464   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:09:11.923521   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:09:12.422896   49088 type.go:168] "Request Body" body=""
	I1202 19:09:12.422975   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:12.423250   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:12.922964   49088 type.go:168] "Request Body" body=""
	I1202 19:09:12.923043   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:12.923387   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:13.423093   49088 type.go:168] "Request Body" body=""
	I1202 19:09:13.423175   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:13.423500   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:13.923207   49088 type.go:168] "Request Body" body=""
	I1202 19:09:13.923272   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:13.923535   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:09:13.923574   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:09:14.422969   49088 type.go:168] "Request Body" body=""
	I1202 19:09:14.423065   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:14.423377   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:14.923293   49088 type.go:168] "Request Body" body=""
	I1202 19:09:14.923367   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:14.923688   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:15.423514   49088 type.go:168] "Request Body" body=""
	I1202 19:09:15.423584   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:15.423882   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:15.923633   49088 type.go:168] "Request Body" body=""
	I1202 19:09:15.923702   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:15.924013   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:09:15.924083   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:09:16.423892   49088 type.go:168] "Request Body" body=""
	I1202 19:09:16.423994   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:16.424346   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:16.922929   49088 type.go:168] "Request Body" body=""
	I1202 19:09:16.922996   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:16.923246   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:17.422959   49088 type.go:168] "Request Body" body=""
	I1202 19:09:17.423030   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:17.423344   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:17.923012   49088 type.go:168] "Request Body" body=""
	I1202 19:09:17.923112   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:17.923445   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:18.423579   49088 type.go:168] "Request Body" body=""
	I1202 19:09:18.423646   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:18.423912   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:09:18.423954   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:09:18.923689   49088 type.go:168] "Request Body" body=""
	I1202 19:09:18.923816   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:18.924164   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:19.423865   49088 type.go:168] "Request Body" body=""
	I1202 19:09:19.423938   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:19.424264   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:19.922971   49088 type.go:168] "Request Body" body=""
	I1202 19:09:19.923065   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:19.928773   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=5
	I1202 19:09:20.423719   49088 type.go:168] "Request Body" body=""
	I1202 19:09:20.423798   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:20.424108   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:09:20.424158   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:09:20.923797   49088 type.go:168] "Request Body" body=""
	I1202 19:09:20.923876   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:20.924234   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:21.423890   49088 type.go:168] "Request Body" body=""
	I1202 19:09:21.423990   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:21.424292   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:21.922965   49088 type.go:168] "Request Body" body=""
	I1202 19:09:21.923044   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:21.923382   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:22.422969   49088 type.go:168] "Request Body" body=""
	I1202 19:09:22.423043   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:22.423383   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:22.923067   49088 type.go:168] "Request Body" body=""
	I1202 19:09:22.923150   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:22.923449   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:09:22.923501   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:09:23.423230   49088 type.go:168] "Request Body" body=""
	I1202 19:09:23.423312   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:23.423745   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:23.922960   49088 type.go:168] "Request Body" body=""
	I1202 19:09:23.923037   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:23.923364   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:24.423610   49088 type.go:168] "Request Body" body=""
	I1202 19:09:24.423697   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:24.423973   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:24.922875   49088 type.go:168] "Request Body" body=""
	I1202 19:09:24.922950   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:24.923296   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:25.422971   49088 type.go:168] "Request Body" body=""
	I1202 19:09:25.423045   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:25.423397   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:09:25.423453   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:09:25.923781   49088 type.go:168] "Request Body" body=""
	I1202 19:09:25.923849   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:25.924111   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:26.423856   49088 type.go:168] "Request Body" body=""
	I1202 19:09:26.423928   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:26.424242   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:26.922969   49088 type.go:168] "Request Body" body=""
	I1202 19:09:26.923039   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:26.923392   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:27.423069   49088 type.go:168] "Request Body" body=""
	I1202 19:09:27.423144   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:27.423407   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:27.922946   49088 type.go:168] "Request Body" body=""
	I1202 19:09:27.923018   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:27.923358   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:09:27.923411   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:09:28.423076   49088 type.go:168] "Request Body" body=""
	I1202 19:09:28.423152   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:28.423495   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:28.923840   49088 type.go:168] "Request Body" body=""
	I1202 19:09:28.923913   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:28.924219   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:29.422911   49088 type.go:168] "Request Body" body=""
	I1202 19:09:29.422989   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:29.423326   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:29.923042   49088 type.go:168] "Request Body" body=""
	I1202 19:09:29.923119   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:29.923480   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:09:29.923538   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:09:30.422961   49088 type.go:168] "Request Body" body=""
	I1202 19:09:30.423047   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:30.423371   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:30.922960   49088 type.go:168] "Request Body" body=""
	I1202 19:09:30.923040   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:30.923370   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:31.423083   49088 type.go:168] "Request Body" body=""
	I1202 19:09:31.423155   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:31.423507   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:31.922881   49088 type.go:168] "Request Body" body=""
	I1202 19:09:31.922954   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:31.923312   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:32.422968   49088 type.go:168] "Request Body" body=""
	I1202 19:09:32.423060   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:32.423400   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:09:32.423472   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:09:32.922956   49088 type.go:168] "Request Body" body=""
	I1202 19:09:32.923050   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:32.923391   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:33.423100   49088 type.go:168] "Request Body" body=""
	I1202 19:09:33.423172   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:33.423473   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:33.922963   49088 type.go:168] "Request Body" body=""
	I1202 19:09:33.923039   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:33.923379   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:34.423096   49088 type.go:168] "Request Body" body=""
	I1202 19:09:34.423177   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:34.423484   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:09:34.423532   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:09:34.923381   49088 type.go:168] "Request Body" body=""
	I1202 19:09:34.923452   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:34.923763   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:35.423619   49088 type.go:168] "Request Body" body=""
	I1202 19:09:35.423698   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:35.424059   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:35.923863   49088 type.go:168] "Request Body" body=""
	I1202 19:09:35.923933   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:35.924297   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:36.423817   49088 type.go:168] "Request Body" body=""
	I1202 19:09:36.423883   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:36.424153   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:09:36.424193   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:09:36.922887   49088 type.go:168] "Request Body" body=""
	I1202 19:09:36.922964   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:36.923304   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:37.422984   49088 type.go:168] "Request Body" body=""
	I1202 19:09:37.423062   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:37.423400   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:37.922914   49088 type.go:168] "Request Body" body=""
	I1202 19:09:37.922987   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:37.923253   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:38.422943   49088 type.go:168] "Request Body" body=""
	I1202 19:09:38.423020   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:38.423341   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:38.922960   49088 type.go:168] "Request Body" body=""
	I1202 19:09:38.923084   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:38.923406   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:09:38.923459   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:09:39.423118   49088 type.go:168] "Request Body" body=""
	I1202 19:09:39.423188   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:39.423443   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:39.922963   49088 type.go:168] "Request Body" body=""
	I1202 19:09:39.923038   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:39.923390   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:40.422957   49088 type.go:168] "Request Body" body=""
	I1202 19:09:40.423031   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:40.423438   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:40.922909   49088 type.go:168] "Request Body" body=""
	I1202 19:09:40.922985   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:40.923328   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:41.422954   49088 type.go:168] "Request Body" body=""
	I1202 19:09:41.423029   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:41.423387   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:09:41.423450   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:09:41.923108   49088 type.go:168] "Request Body" body=""
	I1202 19:09:41.923187   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:41.923536   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:42.423214   49088 type.go:168] "Request Body" body=""
	I1202 19:09:42.423293   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:42.423567   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:42.922993   49088 type.go:168] "Request Body" body=""
	I1202 19:09:42.923073   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:42.923422   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:43.422977   49088 type.go:168] "Request Body" body=""
	I1202 19:09:43.423055   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:43.423398   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:43.923097   49088 type.go:168] "Request Body" body=""
	I1202 19:09:43.923183   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:43.923554   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:09:43.923610   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:09:44.422960   49088 type.go:168] "Request Body" body=""
	I1202 19:09:44.423031   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:44.423372   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:44.923133   49088 type.go:168] "Request Body" body=""
	I1202 19:09:44.923217   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:44.923568   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:45.422996   49088 type.go:168] "Request Body" body=""
	I1202 19:09:45.423065   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:45.423325   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:45.922966   49088 type.go:168] "Request Body" body=""
	I1202 19:09:45.923043   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:45.923392   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:46.422965   49088 type.go:168] "Request Body" body=""
	I1202 19:09:46.423041   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:46.423338   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:09:46.423383   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:09:46.923662   49088 type.go:168] "Request Body" body=""
	I1202 19:09:46.923729   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:46.923996   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:47.423794   49088 type.go:168] "Request Body" body=""
	I1202 19:09:47.423868   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:47.424199   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:47.922887   49088 type.go:168] "Request Body" body=""
	I1202 19:09:47.922970   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:47.923290   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:48.423862   49088 type.go:168] "Request Body" body=""
	I1202 19:09:48.423930   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:48.424197   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:09:48.424242   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:09:48.922882   49088 type.go:168] "Request Body" body=""
	I1202 19:09:48.922964   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:48.923305   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:49.422874   49088 type.go:168] "Request Body" body=""
	I1202 19:09:49.422952   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:49.423501   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:49.923227   49088 type.go:168] "Request Body" body=""
	I1202 19:09:49.923298   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:49.923571   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:50.423530   49088 type.go:168] "Request Body" body=""
	I1202 19:09:50.423605   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:50.423930   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:50.923709   49088 type.go:168] "Request Body" body=""
	I1202 19:09:50.923791   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:50.924129   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:09:50.924183   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:09:51.423574   49088 type.go:168] "Request Body" body=""
	I1202 19:09:51.423645   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:51.423989   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:51.923773   49088 type.go:168] "Request Body" body=""
	I1202 19:09:51.923846   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:51.924175   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:52.423792   49088 type.go:168] "Request Body" body=""
	I1202 19:09:52.423865   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:52.424194   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:52.923791   49088 type.go:168] "Request Body" body=""
	I1202 19:09:52.923863   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:52.924133   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:53.423850   49088 type.go:168] "Request Body" body=""
	I1202 19:09:53.423929   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:53.424252   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:09:53.424366   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:09:53.922972   49088 type.go:168] "Request Body" body=""
	I1202 19:09:53.923047   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:53.923376   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:54.422922   49088 type.go:168] "Request Body" body=""
	I1202 19:09:54.422999   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:54.423364   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:54.923085   49088 type.go:168] "Request Body" body=""
	I1202 19:09:54.923175   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:54.923548   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:55.422950   49088 type.go:168] "Request Body" body=""
	I1202 19:09:55.423041   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:55.423400   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:55.923676   49088 type.go:168] "Request Body" body=""
	I1202 19:09:55.923766   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:55.924024   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:09:55.924066   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:09:56.423823   49088 type.go:168] "Request Body" body=""
	I1202 19:09:56.423900   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:56.424217   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:56.922947   49088 type.go:168] "Request Body" body=""
	I1202 19:09:56.923022   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:56.923366   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:57.422911   49088 type.go:168] "Request Body" body=""
	I1202 19:09:57.422984   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:57.423240   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:57.922952   49088 type.go:168] "Request Body" body=""
	I1202 19:09:57.923033   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:57.923361   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:58.422959   49088 type.go:168] "Request Body" body=""
	I1202 19:09:58.423033   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:58.423382   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:09:58.423443   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:09:58.923738   49088 type.go:168] "Request Body" body=""
	I1202 19:09:58.923812   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:58.924072   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:59.423859   49088 type.go:168] "Request Body" body=""
	I1202 19:09:59.423937   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:59.424270   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:59.922977   49088 type.go:168] "Request Body" body=""
	I1202 19:09:59.923052   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:59.923398   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:00.435478   49088 type.go:168] "Request Body" body=""
	I1202 19:10:00.435562   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:00.435862   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:10:00.435913   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:10:00.923635   49088 type.go:168] "Request Body" body=""
	I1202 19:10:00.923715   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:00.924056   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:01.423705   49088 type.go:168] "Request Body" body=""
	I1202 19:10:01.423779   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:01.424081   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:01.923812   49088 type.go:168] "Request Body" body=""
	I1202 19:10:01.923878   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:01.924156   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:02.423889   49088 type.go:168] "Request Body" body=""
	I1202 19:10:02.423969   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:02.424269   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:02.922968   49088 type.go:168] "Request Body" body=""
	I1202 19:10:02.923056   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:02.923443   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:10:02.923500   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:10:03.423151   49088 type.go:168] "Request Body" body=""
	I1202 19:10:03.423226   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:03.423486   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:03.922960   49088 type.go:168] "Request Body" body=""
	I1202 19:10:03.923040   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:03.923371   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:04.422986   49088 type.go:168] "Request Body" body=""
	I1202 19:10:04.423059   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:04.423412   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:04.923351   49088 type.go:168] "Request Body" body=""
	I1202 19:10:04.923425   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:04.923691   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:10:04.923736   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:10:05.423841   49088 type.go:168] "Request Body" body=""
	I1202 19:10:05.423932   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:05.424309   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:05.922992   49088 type.go:168] "Request Body" body=""
	I1202 19:10:05.923078   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:05.923444   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:06.423123   49088 type.go:168] "Request Body" body=""
	I1202 19:10:06.423232   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:06.423495   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:06.922977   49088 type.go:168] "Request Body" body=""
	I1202 19:10:06.923050   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:06.923408   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:07.422988   49088 type.go:168] "Request Body" body=""
	I1202 19:10:07.423069   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:07.423489   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:10:07.423547   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:10:07.923028   49088 type.go:168] "Request Body" body=""
	I1202 19:10:07.923107   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:07.923442   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:08.422975   49088 type.go:168] "Request Body" body=""
	I1202 19:10:08.423051   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:08.423408   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:08.922956   49088 type.go:168] "Request Body" body=""
	I1202 19:10:08.923038   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:08.923365   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:09.423008   49088 type.go:168] "Request Body" body=""
	I1202 19:10:09.423094   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:09.423439   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:09.922983   49088 type.go:168] "Request Body" body=""
	I1202 19:10:09.923062   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:09.923410   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:10:09.923465   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:10:10.422981   49088 type.go:168] "Request Body" body=""
	I1202 19:10:10.423060   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:10.423387   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:10.922955   49088 type.go:168] "Request Body" body=""
	I1202 19:10:10.923025   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:10.923302   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:11.422980   49088 type.go:168] "Request Body" body=""
	I1202 19:10:11.423052   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:11.423400   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:11.922952   49088 type.go:168] "Request Body" body=""
	I1202 19:10:11.923032   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:11.923379   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:12.422929   49088 type.go:168] "Request Body" body=""
	I1202 19:10:12.423003   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:12.423285   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:10:12.423327   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:10:12.922978   49088 type.go:168] "Request Body" body=""
	I1202 19:10:12.923051   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:12.923388   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:13.422975   49088 type.go:168] "Request Body" body=""
	I1202 19:10:13.423050   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:13.423417   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:13.922921   49088 type.go:168] "Request Body" body=""
	I1202 19:10:13.922997   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:13.923304   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:14.422948   49088 type.go:168] "Request Body" body=""
	I1202 19:10:14.423026   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:14.423361   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:10:14.423418   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:10:14.923139   49088 type.go:168] "Request Body" body=""
	I1202 19:10:14.923221   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:14.923551   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:15.423337   49088 type.go:168] "Request Body" body=""
	I1202 19:10:15.423420   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:15.423733   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:15.922959   49088 type.go:168] "Request Body" body=""
	I1202 19:10:15.923040   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:15.923380   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:16.422956   49088 type.go:168] "Request Body" body=""
	I1202 19:10:16.423036   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:16.423383   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:16.923752   49088 type.go:168] "Request Body" body=""
	I1202 19:10:16.923818   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:16.924073   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:10:16.924112   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:10:17.423826   49088 type.go:168] "Request Body" body=""
	I1202 19:10:17.423915   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:17.424256   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:17.922988   49088 type.go:168] "Request Body" body=""
	I1202 19:10:17.923068   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:17.923403   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:18.422876   49088 type.go:168] "Request Body" body=""
	I1202 19:10:18.422953   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:18.423220   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:18.922913   49088 type.go:168] "Request Body" body=""
	I1202 19:10:18.922988   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:18.923326   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:19.423037   49088 type.go:168] "Request Body" body=""
	I1202 19:10:19.423111   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:19.423450   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:10:19.423505   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:10:19.923780   49088 type.go:168] "Request Body" body=""
	I1202 19:10:19.923847   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:19.924112   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:20.423105   49088 type.go:168] "Request Body" body=""
	I1202 19:10:20.423212   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:20.423516   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:20.922965   49088 type.go:168] "Request Body" body=""
	I1202 19:10:20.923060   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:20.923378   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:21.423065   49088 type.go:168] "Request Body" body=""
	I1202 19:10:21.423140   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:21.423414   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:21.923020   49088 type.go:168] "Request Body" body=""
	I1202 19:10:21.923093   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:21.923371   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:10:21.923415   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:10:22.423093   49088 type.go:168] "Request Body" body=""
	I1202 19:10:22.423166   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:22.423511   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:22.923085   49088 type.go:168] "Request Body" body=""
	I1202 19:10:22.923154   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:22.923434   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:23.423060   49088 type.go:168] "Request Body" body=""
	I1202 19:10:23.423133   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:23.423446   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:23.923195   49088 type.go:168] "Request Body" body=""
	I1202 19:10:23.923269   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:23.923577   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:10:23.923622   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:10:24.422877   49088 type.go:168] "Request Body" body=""
	I1202 19:10:24.422957   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:24.423230   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:24.923115   49088 type.go:168] "Request Body" body=""
	I1202 19:10:24.923194   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:24.923600   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:25.423537   49088 type.go:168] "Request Body" body=""
	I1202 19:10:25.423610   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:25.423949   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:25.923703   49088 type.go:168] "Request Body" body=""
	I1202 19:10:25.923775   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:25.924043   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:10:25.924103   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:10:26.423902   49088 type.go:168] "Request Body" body=""
	I1202 19:10:26.423982   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:26.424292   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:26.922962   49088 type.go:168] "Request Body" body=""
	I1202 19:10:26.923073   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:26.923396   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:27.423706   49088 type.go:168] "Request Body" body=""
	I1202 19:10:27.423778   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:27.424090   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:27.923873   49088 type.go:168] "Request Body" body=""
	I1202 19:10:27.923954   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:27.924307   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:10:27.924397   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:10:28.422900   49088 type.go:168] "Request Body" body=""
	I1202 19:10:28.423017   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:28.423349   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:28.922918   49088 type.go:168] "Request Body" body=""
	I1202 19:10:28.922988   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:28.923270   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:29.422990   49088 type.go:168] "Request Body" body=""
	I1202 19:10:29.423072   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:29.423426   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:29.922978   49088 type.go:168] "Request Body" body=""
	I1202 19:10:29.923053   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:29.923386   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:30.422933   49088 type.go:168] "Request Body" body=""
	I1202 19:10:30.423008   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:30.423306   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:10:30.423392   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:10:30.923054   49088 type.go:168] "Request Body" body=""
	I1202 19:10:30.923133   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:30.923439   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:31.423123   49088 type.go:168] "Request Body" body=""
	I1202 19:10:31.423203   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:31.423539   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:31.923853   49088 type.go:168] "Request Body" body=""
	I1202 19:10:31.923920   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:31.924180   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:32.422889   49088 type.go:168] "Request Body" body=""
	I1202 19:10:32.422970   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:32.423319   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:32.922910   49088 type.go:168] "Request Body" body=""
	I1202 19:10:32.922985   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:32.923321   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:10:32.923375   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:10:33.423014   49088 type.go:168] "Request Body" body=""
	I1202 19:10:33.423095   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:33.423398   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:33.923072   49088 type.go:168] "Request Body" body=""
	I1202 19:10:33.923148   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:33.923513   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:34.423108   49088 type.go:168] "Request Body" body=""
	I1202 19:10:34.423190   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:34.423541   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:34.923286   49088 type.go:168] "Request Body" body=""
	I1202 19:10:34.923355   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:34.923630   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:10:34.923672   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:10:35.423752   49088 type.go:168] "Request Body" body=""
	I1202 19:10:35.423834   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:35.424190   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:35.922887   49088 type.go:168] "Request Body" body=""
	I1202 19:10:35.922967   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:35.923295   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:36.422982   49088 type.go:168] "Request Body" body=""
	I1202 19:10:36.423054   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:36.423305   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:36.923037   49088 type.go:168] "Request Body" body=""
	I1202 19:10:36.923115   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:36.923442   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:37.423029   49088 type.go:168] "Request Body" body=""
	I1202 19:10:37.423102   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:37.423425   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:10:37.423480   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:10:37.923773   49088 type.go:168] "Request Body" body=""
	I1202 19:10:37.923861   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:37.924136   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:38.423899   49088 type.go:168] "Request Body" body=""
	I1202 19:10:38.423979   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:38.424296   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:38.922970   49088 type.go:168] "Request Body" body=""
	I1202 19:10:38.923051   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:38.923372   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:39.423713   49088 type.go:168] "Request Body" body=""
	I1202 19:10:39.423780   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:39.424040   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:10:39.424081   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:10:39.923835   49088 type.go:168] "Request Body" body=""
	I1202 19:10:39.923908   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:39.924227   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:40.423209   49088 type.go:168] "Request Body" body=""
	I1202 19:10:40.423286   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:40.423612   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:40.923000   49088 type.go:168] "Request Body" body=""
	I1202 19:10:40.923083   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:40.923387   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:41.422980   49088 type.go:168] "Request Body" body=""
	I1202 19:10:41.423055   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:41.423415   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:41.922974   49088 type.go:168] "Request Body" body=""
	I1202 19:10:41.923048   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:41.923367   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:10:41.923430   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:10:42.422892   49088 type.go:168] "Request Body" body=""
	I1202 19:10:42.422960   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:42.423218   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:42.922977   49088 type.go:168] "Request Body" body=""
	I1202 19:10:42.923049   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:42.923389   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:43.422967   49088 type.go:168] "Request Body" body=""
	I1202 19:10:43.423039   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:43.423388   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:43.922911   49088 type.go:168] "Request Body" body=""
	I1202 19:10:43.922995   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:43.923286   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:44.422975   49088 type.go:168] "Request Body" body=""
	I1202 19:10:44.423057   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:44.423388   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:10:44.423441   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:10:44.923165   49088 type.go:168] "Request Body" body=""
	I1202 19:10:44.923245   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:44.923572   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:45.423613   49088 type.go:168] "Request Body" body=""
	I1202 19:10:45.423695   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:45.423958   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:45.924133   49088 type.go:168] "Request Body" body=""
	I1202 19:10:45.924208   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:45.924557   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:46.423281   49088 type.go:168] "Request Body" body=""
	I1202 19:10:46.423365   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:46.423700   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:10:46.423760   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:10:46.923435   49088 type.go:168] "Request Body" body=""
	I1202 19:10:46.923504   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:46.923772   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:47.423542   49088 type.go:168] "Request Body" body=""
	I1202 19:10:47.423616   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:47.423946   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:47.923718   49088 type.go:168] "Request Body" body=""
	I1202 19:10:47.923790   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:47.924128   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:48.423446   49088 type.go:168] "Request Body" body=""
	I1202 19:10:48.423517   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:48.423772   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:10:48.423814   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:10:48.923540   49088 type.go:168] "Request Body" body=""
	I1202 19:10:48.923616   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:48.923948   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:49.423625   49088 type.go:168] "Request Body" body=""
	I1202 19:10:49.423703   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:49.424044   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:49.923749   49088 type.go:168] "Request Body" body=""
	I1202 19:10:49.923817   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:49.924118   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:50.422979   49088 type.go:168] "Request Body" body=""
	I1202 19:10:50.423051   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:50.423386   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:50.923086   49088 type.go:168] "Request Body" body=""
	I1202 19:10:50.923163   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:50.923498   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:10:50.923558   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:10:51.422877   49088 type.go:168] "Request Body" body=""
	I1202 19:10:51.422947   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:51.423236   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:51.922968   49088 type.go:168] "Request Body" body=""
	I1202 19:10:51.923043   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:51.923370   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:52.423002   49088 type.go:168] "Request Body" body=""
	I1202 19:10:52.423076   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:52.423410   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:52.922879   49088 type.go:168] "Request Body" body=""
	I1202 19:10:52.922948   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:52.923224   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:53.422960   49088 type.go:168] "Request Body" body=""
	I1202 19:10:53.423034   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:53.423361   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:10:53.423412   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:10:53.923077   49088 type.go:168] "Request Body" body=""
	I1202 19:10:53.923157   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:53.923495   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:54.422917   49088 type.go:168] "Request Body" body=""
	I1202 19:10:54.422986   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:54.423246   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:54.923135   49088 type.go:168] "Request Body" body=""
	I1202 19:10:54.923208   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:54.923556   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:55.423559   49088 type.go:168] "Request Body" body=""
	I1202 19:10:55.423636   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:55.423971   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:10:55.424022   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:10:55.923605   49088 type.go:168] "Request Body" body=""
	I1202 19:10:55.923678   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:55.923946   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:56.423714   49088 type.go:168] "Request Body" body=""
	I1202 19:10:56.423787   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:56.424129   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:56.923785   49088 type.go:168] "Request Body" body=""
	I1202 19:10:56.923856   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:56.924173   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:57.423656   49088 type.go:168] "Request Body" body=""
	I1202 19:10:57.423734   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:57.423996   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:57.923693   49088 type.go:168] "Request Body" body=""
	I1202 19:10:57.923764   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:57.924078   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:10:57.924131   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:10:58.423895   49088 type.go:168] "Request Body" body=""
	I1202 19:10:58.423973   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:58.424286   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:58.922875   49088 type.go:168] "Request Body" body=""
	I1202 19:10:58.922950   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:58.923200   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:59.422980   49088 type.go:168] "Request Body" body=""
	I1202 19:10:59.423061   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:59.423383   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:59.922968   49088 type.go:168] "Request Body" body=""
	I1202 19:10:59.923042   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:59.923368   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:00.423287   49088 type.go:168] "Request Body" body=""
	I1202 19:11:00.423360   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:00.423657   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:11:00.423700   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:11:00.922983   49088 type.go:168] "Request Body" body=""
	I1202 19:11:00.923059   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:00.923393   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:01.423104   49088 type.go:168] "Request Body" body=""
	I1202 19:11:01.423186   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:01.423527   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:01.923028   49088 type.go:168] "Request Body" body=""
	I1202 19:11:01.923097   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:01.923355   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:02.422961   49088 type.go:168] "Request Body" body=""
	I1202 19:11:02.423036   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:02.423393   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:02.923093   49088 type.go:168] "Request Body" body=""
	I1202 19:11:02.923169   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:02.923483   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:11:02.923543   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:11:03.422923   49088 type.go:168] "Request Body" body=""
	I1202 19:11:03.422993   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:03.423275   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:03.922966   49088 type.go:168] "Request Body" body=""
	I1202 19:11:03.923046   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:03.923401   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:04.423109   49088 type.go:168] "Request Body" body=""
	I1202 19:11:04.423183   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:04.423480   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:04.922963   49088 type.go:168] "Request Body" body=""
	I1202 19:11:04.923029   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:04.923291   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:05.422943   49088 type.go:168] "Request Body" body=""
	I1202 19:11:05.423019   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:05.423416   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:11:05.423485   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:11:05.922975   49088 type.go:168] "Request Body" body=""
	I1202 19:11:05.923049   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:05.923391   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:06.423068   49088 type.go:168] "Request Body" body=""
	I1202 19:11:06.423143   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:06.423404   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:06.922954   49088 type.go:168] "Request Body" body=""
	I1202 19:11:06.923033   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:06.923371   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:07.423090   49088 type.go:168] "Request Body" body=""
	I1202 19:11:07.423173   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:07.423511   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:11:07.423577   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:11:07.922961   49088 type.go:168] "Request Body" body=""
	I1202 19:11:07.923059   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:07.923477   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:08.423196   49088 type.go:168] "Request Body" body=""
	I1202 19:11:08.423268   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:08.423618   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:08.923317   49088 type.go:168] "Request Body" body=""
	I1202 19:11:08.923395   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:08.923714   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:09.423471   49088 type.go:168] "Request Body" body=""
	I1202 19:11:09.423536   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:09.423793   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:11:09.423831   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:11:09.923592   49088 type.go:168] "Request Body" body=""
	I1202 19:11:09.923672   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:09.923995   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:10.422883   49088 type.go:168] "Request Body" body=""
	I1202 19:11:10.422965   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:10.423316   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:10.922966   49088 type.go:168] "Request Body" body=""
	I1202 19:11:10.923035   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:10.923371   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:11.422986   49088 type.go:168] "Request Body" body=""
	I1202 19:11:11.423060   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:11.423401   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:11.923095   49088 type.go:168] "Request Body" body=""
	I1202 19:11:11.923169   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:11.923521   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:11:11.923576   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:11:12.423858   49088 type.go:168] "Request Body" body=""
	I1202 19:11:12.423929   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:12.424221   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:12.922920   49088 type.go:168] "Request Body" body=""
	I1202 19:11:12.923007   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:12.923376   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:13.423103   49088 type.go:168] "Request Body" body=""
	I1202 19:11:13.423187   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:13.423573   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:13.923844   49088 type.go:168] "Request Body" body=""
	I1202 19:11:13.923913   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:13.924261   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:11:13.924357   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:11:14.422970   49088 type.go:168] "Request Body" body=""
	I1202 19:11:14.423042   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:14.423400   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:14.923176   49088 type.go:168] "Request Body" body=""
	I1202 19:11:14.923259   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:14.923607   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:15.423538   49088 type.go:168] "Request Body" body=""
	I1202 19:11:15.423610   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:15.423918   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:15.923744   49088 type.go:168] "Request Body" body=""
	I1202 19:11:15.923820   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:15.924119   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:16.423897   49088 type.go:168] "Request Body" body=""
	I1202 19:11:16.423968   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:16.424281   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:11:16.424354   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:11:16.922924   49088 type.go:168] "Request Body" body=""
	I1202 19:11:16.923006   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:16.923265   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:17.422970   49088 type.go:168] "Request Body" body=""
	I1202 19:11:17.423042   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:17.423367   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:17.922982   49088 type.go:168] "Request Body" body=""
	I1202 19:11:17.923056   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:17.923356   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:18.422877   49088 type.go:168] "Request Body" body=""
	I1202 19:11:18.422949   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:18.423201   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:18.922936   49088 type.go:168] "Request Body" body=""
	I1202 19:11:18.923013   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:18.923346   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:11:18.923404   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:11:19.422980   49088 type.go:168] "Request Body" body=""
	I1202 19:11:19.423053   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:19.423374   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:19.922930   49088 type.go:168] "Request Body" body=""
	I1202 19:11:19.923009   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:19.923288   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:20.422975   49088 type.go:168] "Request Body" body=""
	I1202 19:11:20.423047   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:20.423341   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:20.923042   49088 type.go:168] "Request Body" body=""
	I1202 19:11:20.923120   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:20.923474   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:11:20.923528   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:11:21.423170   49088 type.go:168] "Request Body" body=""
	I1202 19:11:21.423246   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:21.423533   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:21.922987   49088 type.go:168] "Request Body" body=""
	I1202 19:11:21.923069   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:21.923428   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:22.423136   49088 type.go:168] "Request Body" body=""
	I1202 19:11:22.423218   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:22.423580   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:22.923816   49088 type.go:168] "Request Body" body=""
	I1202 19:11:22.923890   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:22.924148   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:11:22.924186   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:11:23.422866   49088 type.go:168] "Request Body" body=""
	I1202 19:11:23.422936   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:23.423286   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:23.923262   49088 type.go:168] "Request Body" body=""
	I1202 19:11:23.923352   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:23.923994   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:24.422942   49088 type.go:168] "Request Body" body=""
	I1202 19:11:24.423029   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:24.423374   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:24.922937   49088 type.go:168] "Request Body" body=""
	I1202 19:11:24.923030   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:24.923429   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:25.422933   49088 type.go:168] "Request Body" body=""
	I1202 19:11:25.423008   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:25.423357   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:11:25.423414   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:11:25.922926   49088 type.go:168] "Request Body" body=""
	I1202 19:11:25.922991   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:25.923253   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:26.422987   49088 type.go:168] "Request Body" body=""
	I1202 19:11:26.423057   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:26.423401   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:26.923094   49088 type.go:168] "Request Body" body=""
	I1202 19:11:26.923165   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:26.923469   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:27.422920   49088 type.go:168] "Request Body" body=""
	I1202 19:11:27.422991   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:27.423278   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:27.922888   49088 type.go:168] "Request Body" body=""
	I1202 19:11:27.922961   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:27.923314   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:11:27.923368   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:11:28.422912   49088 type.go:168] "Request Body" body=""
	I1202 19:11:28.422990   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:28.423375   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:28.923772   49088 type.go:168] "Request Body" body=""
	I1202 19:11:28.923838   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:28.924083   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:29.423850   49088 type.go:168] "Request Body" body=""
	I1202 19:11:29.423927   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:29.424260   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:29.923896   49088 type.go:168] "Request Body" body=""
	I1202 19:11:29.923968   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:29.924284   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:11:29.924358   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:11:30.423879   49088 type.go:168] "Request Body" body=""
	I1202 19:11:30.423953   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:30.424220   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:30.922930   49088 type.go:168] "Request Body" body=""
	I1202 19:11:30.923004   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:30.923350   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:31.423073   49088 type.go:168] "Request Body" body=""
	I1202 19:11:31.423147   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:31.423507   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:31.922953   49088 type.go:168] "Request Body" body=""
	I1202 19:11:31.923031   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:31.923337   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:32.422971   49088 type.go:168] "Request Body" body=""
	I1202 19:11:32.423046   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:32.423366   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:11:32.423417   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:11:32.923034   49088 type.go:168] "Request Body" body=""
	I1202 19:11:32.923109   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:32.923438   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:33.423135   49088 type.go:168] "Request Body" body=""
	I1202 19:11:33.423204   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:33.423464   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:33.922974   49088 type.go:168] "Request Body" body=""
	I1202 19:11:33.923049   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:33.923387   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:34.423091   49088 type.go:168] "Request Body" body=""
	I1202 19:11:34.423168   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:34.423523   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:11:34.423580   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:11:34.923239   49088 type.go:168] "Request Body" body=""
	I1202 19:11:34.923307   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:34.923573   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:35.423667   49088 type.go:168] "Request Body" body=""
	I1202 19:11:35.423743   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:35.424088   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:35.923861   49088 type.go:168] "Request Body" body=""
	I1202 19:11:35.923938   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:35.924296   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:36.422936   49088 type.go:168] "Request Body" body=""
	I1202 19:11:36.423001   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:36.423257   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:36.922952   49088 type.go:168] "Request Body" body=""
	I1202 19:11:36.923029   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:36.923375   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:11:36.923429   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:11:37.422938   49088 type.go:168] "Request Body" body=""
	I1202 19:11:37.423016   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:37.423347   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:37.923043   49088 type.go:168] "Request Body" body=""
	I1202 19:11:37.923123   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:37.923400   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:38.422941   49088 type.go:168] "Request Body" body=""
	I1202 19:11:38.423011   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:38.423321   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:38.922959   49088 type.go:168] "Request Body" body=""
	I1202 19:11:38.923034   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:38.923352   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:39.422867   49088 type.go:168] "Request Body" body=""
	I1202 19:11:39.422947   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:39.423239   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:11:39.423286   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:11:39.922967   49088 type.go:168] "Request Body" body=""
	I1202 19:11:39.923043   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:39.923373   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:40.422980   49088 type.go:168] "Request Body" body=""
	I1202 19:11:40.423062   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:40.423412   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:40.923647   49088 type.go:168] "Request Body" body=""
	I1202 19:11:40.923717   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:40.923978   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:41.423742   49088 type.go:168] "Request Body" body=""
	I1202 19:11:41.423821   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:41.424168   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:11:41.424221   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:11:41.922872   49088 type.go:168] "Request Body" body=""
	I1202 19:11:41.922945   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:41.923310   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:42.423001   49088 type.go:168] "Request Body" body=""
	I1202 19:11:42.423082   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:42.423364   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:42.922973   49088 type.go:168] "Request Body" body=""
	I1202 19:11:42.923054   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:42.923395   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:43.422979   49088 type.go:168] "Request Body" body=""
	I1202 19:11:43.423052   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:43.423368   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:43.922915   49088 type.go:168] "Request Body" body=""
	I1202 19:11:43.922986   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:43.923252   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:11:43.923292   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:11:44.422977   49088 type.go:168] "Request Body" body=""
	I1202 19:11:44.423058   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:44.423399   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:44.923181   49088 type.go:168] "Request Body" body=""
	I1202 19:11:44.923261   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:44.923585   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:45.423566   49088 type.go:168] "Request Body" body=""
	I1202 19:11:45.423644   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:45.423912   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:45.923624   49088 type.go:168] "Request Body" body=""
	I1202 19:11:45.923703   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:45.924023   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:11:45.924071   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:11:46.423663   49088 type.go:168] "Request Body" body=""
	I1202 19:11:46.423734   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:46.424056   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:46.923091   49088 type.go:168] "Request Body" body=""
	I1202 19:11:46.923157   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:46.923456   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:47.423153   49088 type.go:168] "Request Body" body=""
	I1202 19:11:47.423230   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:47.423569   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:47.923278   49088 type.go:168] "Request Body" body=""
	I1202 19:11:47.923362   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:47.923689   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:48.422896   49088 type.go:168] "Request Body" body=""
	I1202 19:11:48.422973   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:48.423231   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:11:48.423281   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:11:48.922955   49088 type.go:168] "Request Body" body=""
	I1202 19:11:48.923028   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:48.923392   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:49.423094   49088 type.go:168] "Request Body" body=""
	I1202 19:11:49.423172   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:49.423529   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:49.923206   49088 type.go:168] "Request Body" body=""
	I1202 19:11:49.923275   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:49.923532   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:50.423633   49088 type.go:168] "Request Body" body=""
	I1202 19:11:50.423711   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:50.424022   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:11:50.424076   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:11:50.923805   49088 type.go:168] "Request Body" body=""
	I1202 19:11:50.923878   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:50.924219   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:51.423760   49088 type.go:168] "Request Body" body=""
	I1202 19:11:51.423833   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:51.424113   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:51.923870   49088 type.go:168] "Request Body" body=""
	I1202 19:11:51.923950   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:51.924286   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:52.422868   49088 type.go:168] "Request Body" body=""
	I1202 19:11:52.422952   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:52.423285   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:52.923892   49088 type.go:168] "Request Body" body=""
	I1202 19:11:52.923962   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:52.924248   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:11:52.924291   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:11:53.422923   49088 type.go:168] "Request Body" body=""
	I1202 19:11:53.423002   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:53.423357   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:53.922965   49088 type.go:168] "Request Body" body=""
	I1202 19:11:53.923041   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:53.923384   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:54.422914   49088 type.go:168] "Request Body" body=""
	I1202 19:11:54.422991   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:54.423296   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:54.922927   49088 type.go:168] "Request Body" body=""
	I1202 19:11:54.923011   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:54.923324   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:55.423007   49088 type.go:168] "Request Body" body=""
	I1202 19:11:55.423084   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:55.423406   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:11:55.423459   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:11:55.922966   49088 type.go:168] "Request Body" body=""
	I1202 19:11:55.923046   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:55.923318   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:56.423006   49088 type.go:168] "Request Body" body=""
	I1202 19:11:56.423078   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:56.423413   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:56.923138   49088 type.go:168] "Request Body" body=""
	I1202 19:11:56.923215   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:56.923566   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:57.423251   49088 type.go:168] "Request Body" body=""
	I1202 19:11:57.423331   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:57.423624   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:11:57.423682   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:11:57.923555   49088 type.go:168] "Request Body" body=""
	I1202 19:11:57.923629   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:57.923962   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:58.423744   49088 type.go:168] "Request Body" body=""
	I1202 19:11:58.423824   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:58.424157   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:58.923666   49088 type.go:168] "Request Body" body=""
	I1202 19:11:58.923737   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:58.924002   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:59.423779   49088 type.go:168] "Request Body" body=""
	I1202 19:11:59.423856   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:59.424204   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:11:59.424262   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:11:59.922940   49088 type.go:168] "Request Body" body=""
	I1202 19:11:59.923028   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:59.923350   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:00.422961   49088 type.go:168] "Request Body" body=""
	I1202 19:12:00.423042   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:00.423415   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:00.923010   49088 type.go:168] "Request Body" body=""
	I1202 19:12:00.923091   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:00.923432   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:01.423015   49088 type.go:168] "Request Body" body=""
	I1202 19:12:01.423098   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:01.423440   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:01.923901   49088 type.go:168] "Request Body" body=""
	I1202 19:12:01.923978   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:01.924301   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:12:01.924373   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:12:02.423047   49088 type.go:168] "Request Body" body=""
	I1202 19:12:02.423120   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:02.423479   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:02.923197   49088 type.go:168] "Request Body" body=""
	I1202 19:12:02.923275   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:02.923599   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:03.422914   49088 type.go:168] "Request Body" body=""
	I1202 19:12:03.422989   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:03.423273   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:03.922947   49088 type.go:168] "Request Body" body=""
	I1202 19:12:03.923023   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:03.923352   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:04.422979   49088 type.go:168] "Request Body" body=""
	I1202 19:12:04.423057   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:04.423399   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:12:04.423455   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:12:04.923130   49088 type.go:168] "Request Body" body=""
	I1202 19:12:04.923196   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:04.923453   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:05.423379   49088 type.go:168] "Request Body" body=""
	I1202 19:12:05.423457   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:05.423797   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:05.923568   49088 type.go:168] "Request Body" body=""
	I1202 19:12:05.923643   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:05.923966   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:06.423488   49088 type.go:168] "Request Body" body=""
	I1202 19:12:06.423556   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:06.423829   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:12:06.423875   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:12:06.923674   49088 type.go:168] "Request Body" body=""
	I1202 19:12:06.923754   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:06.924076   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:07.423881   49088 type.go:168] "Request Body" body=""
	I1202 19:12:07.423954   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:07.424301   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:07.923933   49088 type.go:168] "Request Body" body=""
	I1202 19:12:07.924013   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:07.924280   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:08.422981   49088 type.go:168] "Request Body" body=""
	I1202 19:12:08.423059   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:08.423411   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:08.923117   49088 type.go:168] "Request Body" body=""
	I1202 19:12:08.923190   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:08.923511   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:12:08.923573   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:12:09.422927   49088 type.go:168] "Request Body" body=""
	I1202 19:12:09.422996   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:09.423311   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:09.923024   49088 type.go:168] "Request Body" body=""
	I1202 19:12:09.923100   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:09.923439   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:10.422995   49088 type.go:168] "Request Body" body=""
	I1202 19:12:10.423070   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:10.423406   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:10.922922   49088 type.go:168] "Request Body" body=""
	I1202 19:12:10.922997   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:10.923336   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:11.422925   49088 type.go:168] "Request Body" body=""
	I1202 19:12:11.423002   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:11.423341   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:12:11.423398   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:12:11.923073   49088 type.go:168] "Request Body" body=""
	I1202 19:12:11.923147   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:11.923465   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:12.422920   49088 type.go:168] "Request Body" body=""
	I1202 19:12:12.422996   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:12.423266   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:12.922985   49088 type.go:168] "Request Body" body=""
	I1202 19:12:12.923057   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:12.923408   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:13.422982   49088 type.go:168] "Request Body" body=""
	I1202 19:12:13.423065   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:13.423401   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:12:13.423447   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:12:13.923740   49088 type.go:168] "Request Body" body=""
	I1202 19:12:13.923807   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:13.924068   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:14.423836   49088 type.go:168] "Request Body" body=""
	I1202 19:12:14.423917   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:14.424266   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:14.922896   49088 type.go:168] "Request Body" body=""
	I1202 19:12:14.922969   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:14.923318   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:15.423109   49088 type.go:168] "Request Body" body=""
	I1202 19:12:15.423181   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:15.423435   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:12:15.423486   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:12:15.922969   49088 type.go:168] "Request Body" body=""
	I1202 19:12:15.923045   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:15.923399   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:16.422983   49088 type.go:168] "Request Body" body=""
	I1202 19:12:16.423056   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:16.423395   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:16.922950   49088 type.go:168] "Request Body" body=""
	I1202 19:12:16.923033   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:16.923357   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:17.422981   49088 type.go:168] "Request Body" body=""
	I1202 19:12:17.423053   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:17.423403   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:17.923097   49088 type.go:168] "Request Body" body=""
	I1202 19:12:17.923175   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:17.923503   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:12:17.923560   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:12:18.423204   49088 type.go:168] "Request Body" body=""
	I1202 19:12:18.423269   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:18.423531   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:18.922961   49088 type.go:168] "Request Body" body=""
	I1202 19:12:18.923042   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:18.923383   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:19.422976   49088 type.go:168] "Request Body" body=""
	I1202 19:12:19.423057   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:19.423400   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:19.923083   49088 type.go:168] "Request Body" body=""
	I1202 19:12:19.923154   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:19.923406   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:20.423485   49088 type.go:168] "Request Body" body=""
	I1202 19:12:20.423567   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:20.423913   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:12:20.423967   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:12:20.923722   49088 type.go:168] "Request Body" body=""
	I1202 19:12:20.923795   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:20.924168   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:21.422933   49088 type.go:168] "Request Body" body=""
	I1202 19:12:21.422998   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:21.423294   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:21.922979   49088 type.go:168] "Request Body" body=""
	I1202 19:12:21.923058   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:21.923391   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:22.422989   49088 type.go:168] "Request Body" body=""
	I1202 19:12:22.423061   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:22.423414   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:22.923563   49088 type.go:168] "Request Body" body=""
	I1202 19:12:22.923637   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:22.923893   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:12:22.923932   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:12:23.423651   49088 type.go:168] "Request Body" body=""
	I1202 19:12:23.423730   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:23.424077   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:23.923725   49088 type.go:168] "Request Body" body=""
	I1202 19:12:23.923795   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:23.924130   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:24.423644   49088 type.go:168] "Request Body" body=""
	I1202 19:12:24.423724   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:24.424004   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:24.923060   49088 type.go:168] "Request Body" body=""
	I1202 19:12:24.923142   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:24.923487   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:25.423281   49088 type.go:168] "Request Body" body=""
	I1202 19:12:25.423364   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:25.423721   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:12:25.423779   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:12:25.923482   49088 type.go:168] "Request Body" body=""
	I1202 19:12:25.923550   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:25.923808   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:26.423542   49088 type.go:168] "Request Body" body=""
	I1202 19:12:26.423613   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:26.423935   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:26.923724   49088 type.go:168] "Request Body" body=""
	I1202 19:12:26.923807   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:26.924187   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:27.422900   49088 type.go:168] "Request Body" body=""
	I1202 19:12:27.422973   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:27.423252   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:27.922940   49088 type.go:168] "Request Body" body=""
	I1202 19:12:27.923019   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:27.923343   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:12:27.923403   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:12:28.422986   49088 type.go:168] "Request Body" body=""
	I1202 19:12:28.423061   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:28.423432   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:28.922921   49088 type.go:168] "Request Body" body=""
	I1202 19:12:28.923034   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:28.923349   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:29.422983   49088 type.go:168] "Request Body" body=""
	I1202 19:12:29.423092   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:29.423421   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:29.923070   49088 type.go:168] "Request Body" body=""
	I1202 19:12:29.923147   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:29.923486   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:12:29.923558   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:12:30.423578   49088 type.go:168] "Request Body" body=""
	I1202 19:12:30.423659   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:30.423950   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:30.923208   49088 type.go:168] "Request Body" body=""
	I1202 19:12:30.923281   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:30.923639   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:31.422982   49088 type.go:168] "Request Body" body=""
	I1202 19:12:31.423059   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:31.423398   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:31.922934   49088 type.go:168] "Request Body" body=""
	I1202 19:12:31.923000   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:31.923257   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:32.422953   49088 type.go:168] "Request Body" body=""
	I1202 19:12:32.423028   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:32.423354   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:12:32.423413   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:12:32.923085   49088 type.go:168] "Request Body" body=""
	I1202 19:12:32.923168   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:32.923564   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:33.423262   49088 type.go:168] "Request Body" body=""
	I1202 19:12:33.423340   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:33.423598   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:33.922973   49088 type.go:168] "Request Body" body=""
	I1202 19:12:33.923049   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:33.923389   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:34.422949   49088 type.go:168] "Request Body" body=""
	I1202 19:12:34.423022   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:34.423336   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:34.922878   49088 type.go:168] "Request Body" body=""
	I1202 19:12:34.922943   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:34.923200   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:12:34.923239   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:12:35.423165   49088 type.go:168] "Request Body" body=""
	I1202 19:12:35.423238   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:35.423568   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:35.923256   49088 type.go:168] "Request Body" body=""
	I1202 19:12:35.923329   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:35.923637   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:36.422898   49088 type.go:168] "Request Body" body=""
	I1202 19:12:36.422965   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:36.423218   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:36.922956   49088 type.go:168] "Request Body" body=""
	I1202 19:12:36.923035   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:36.923355   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:12:36.923409   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:12:37.423086   49088 type.go:168] "Request Body" body=""
	I1202 19:12:37.423161   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:37.423449   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:37.923834   49088 type.go:168] "Request Body" body=""
	I1202 19:12:37.923903   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:37.924159   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:38.422906   49088 type.go:168] "Request Body" body=""
	I1202 19:12:38.422977   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:38.423261   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:38.922970   49088 type.go:168] "Request Body" body=""
	I1202 19:12:38.923046   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:38.923389   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:12:38.923440   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:12:39.423714   49088 type.go:168] "Request Body" body=""
	I1202 19:12:39.423788   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:39.424096   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:39.923879   49088 type.go:168] "Request Body" body=""
	I1202 19:12:39.923950   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:39.924267   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:40.423003   49088 type.go:168] "Request Body" body=""
	I1202 19:12:40.423081   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:40.423385   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:40.923058   49088 type.go:168] "Request Body" body=""
	I1202 19:12:40.923129   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:40.923426   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:12:40.923486   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:12:41.422953   49088 type.go:168] "Request Body" body=""
	I1202 19:12:41.423050   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:41.423343   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:41.923078   49088 type.go:168] "Request Body" body=""
	I1202 19:12:41.923156   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:41.923473   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:42.423139   49088 type.go:168] "Request Body" body=""
	I1202 19:12:42.423204   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:42.423502   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:42.923011   49088 type.go:168] "Request Body" body=""
	I1202 19:12:42.923090   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:42.923406   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:43.423000   49088 type.go:168] "Request Body" body=""
	I1202 19:12:43.423079   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:43.423400   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:12:43.423446   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:12:43.923623   49088 type.go:168] "Request Body" body=""
	I1202 19:12:43.923696   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:43.923958   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:44.423707   49088 type.go:168] "Request Body" body=""
	I1202 19:12:44.423805   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:44.424192   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:44.923041   49088 type.go:168] "Request Body" body=""
	I1202 19:12:44.923114   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:44.923460   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:45.423487   49088 type.go:168] "Request Body" body=""
	I1202 19:12:45.423555   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:45.423816   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:12:45.423856   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:12:45.923602   49088 type.go:168] "Request Body" body=""
	I1202 19:12:45.923678   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:45.924003   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:46.423772   49088 type.go:168] "Request Body" body=""
	I1202 19:12:46.423843   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:46.424193   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:46.923702   49088 type.go:168] "Request Body" body=""
	I1202 19:12:46.923775   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:46.924028   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:47.423742   49088 type.go:168] "Request Body" body=""
	I1202 19:12:47.423811   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:47.424126   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:12:47.424184   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:12:47.923682   49088 type.go:168] "Request Body" body=""
	I1202 19:12:47.923759   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:47.924070   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:48.423650   49088 type.go:168] "Request Body" body=""
	I1202 19:12:48.423719   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:48.423981   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:48.923704   49088 type.go:168] "Request Body" body=""
	I1202 19:12:48.923774   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:48.924090   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:49.423900   49088 type.go:168] "Request Body" body=""
	I1202 19:12:49.423983   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:49.424354   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:12:49.424408   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:12:49.922950   49088 type.go:168] "Request Body" body=""
	I1202 19:12:49.923023   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:49.923365   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:50.423243   49088 type.go:168] "Request Body" body=""
	I1202 19:12:50.423322   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:50.423653   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:50.922964   49088 type.go:168] "Request Body" body=""
	I1202 19:12:50.923042   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:50.923377   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:51.422954   49088 type.go:168] "Request Body" body=""
	I1202 19:12:51.423023   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:51.423346   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:51.923047   49088 type.go:168] "Request Body" body=""
	I1202 19:12:51.923126   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:51.923460   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:12:51.923513   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:12:52.422937   49088 type.go:168] "Request Body" body=""
	I1202 19:12:52.423014   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:52.423303   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:52.922929   49088 type.go:168] "Request Body" body=""
	I1202 19:12:52.923016   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:52.923375   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:53.423008   49088 type.go:168] "Request Body" body=""
	I1202 19:12:53.423100   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:53.423482   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:53.922987   49088 type.go:168] "Request Body" body=""
	I1202 19:12:53.923061   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:53.923413   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:54.422909   49088 type.go:168] "Request Body" body=""
	I1202 19:12:54.422983   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:54.423246   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:12:54.423296   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:12:54.923072   49088 type.go:168] "Request Body" body=""
	I1202 19:12:54.923153   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:54.923523   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:55.423518   49088 type.go:168] "Request Body" body=""
	I1202 19:12:55.423603   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:55.423968   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:55.923602   49088 type.go:168] "Request Body" body=""
	I1202 19:12:55.923672   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:55.924000   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:56.423806   49088 type.go:168] "Request Body" body=""
	I1202 19:12:56.423881   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:56.424245   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:12:56.424298   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:12:56.923893   49088 type.go:168] "Request Body" body=""
	I1202 19:12:56.923967   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:56.924355   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:57.422930   49088 type.go:168] "Request Body" body=""
	I1202 19:12:57.422997   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:57.423287   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:57.922958   49088 type.go:168] "Request Body" body=""
	I1202 19:12:57.923030   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:57.923341   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:58.423065   49088 type.go:168] "Request Body" body=""
	I1202 19:12:58.423137   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:58.423498   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:58.923185   49088 type.go:168] "Request Body" body=""
	I1202 19:12:58.923255   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:58.923518   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:12:58.923557   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:12:59.422988   49088 type.go:168] "Request Body" body=""
	I1202 19:12:59.423062   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:59.423415   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:59.923116   49088 type.go:168] "Request Body" body=""
	I1202 19:12:59.923196   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:59.923537   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:00.422981   49088 type.go:168] "Request Body" body=""
	I1202 19:13:00.423055   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:00.423421   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:00.922957   49088 type.go:168] "Request Body" body=""
	I1202 19:13:00.923031   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:00.923358   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:01.423025   49088 type.go:168] "Request Body" body=""
	I1202 19:13:01.423098   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:01.423429   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:13:01.423485   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:13:01.923135   49088 type.go:168] "Request Body" body=""
	I1202 19:13:01.923210   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:01.923470   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:02.422970   49088 type.go:168] "Request Body" body=""
	I1202 19:13:02.423066   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:02.423386   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:02.923090   49088 type.go:168] "Request Body" body=""
	I1202 19:13:02.923194   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:02.923514   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:03.423202   49088 type.go:168] "Request Body" body=""
	I1202 19:13:03.423271   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:03.423580   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:13:03.423632   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:13:03.922949   49088 type.go:168] "Request Body" body=""
	I1202 19:13:03.923051   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:03.923344   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:04.422975   49088 type.go:168] "Request Body" body=""
	I1202 19:13:04.423049   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:04.423409   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:04.923130   49088 type.go:168] "Request Body" body=""
	I1202 19:13:04.923198   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:04.923485   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:05.423469   49088 type.go:168] "Request Body" body=""
	I1202 19:13:05.423540   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:05.423887   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:13:05.423943   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:13:05.923727   49088 type.go:168] "Request Body" body=""
	I1202 19:13:05.923801   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:05.924115   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:06.423862   49088 type.go:168] "Request Body" body=""
	I1202 19:13:06.423938   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:06.424199   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:06.922912   49088 type.go:168] "Request Body" body=""
	I1202 19:13:06.922984   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:06.923325   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:07.422978   49088 type.go:168] "Request Body" body=""
	I1202 19:13:07.423053   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:07.423373   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:07.922921   49088 type.go:168] "Request Body" body=""
	I1202 19:13:07.922995   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:07.923258   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:13:07.923298   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:13:08.422983   49088 type.go:168] "Request Body" body=""
	I1202 19:13:08.423057   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:08.423391   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:08.922933   49088 type.go:168] "Request Body" body=""
	I1202 19:13:08.923006   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:08.923329   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:09.422864   49088 type.go:168] "Request Body" body=""
	I1202 19:13:09.422931   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:09.423213   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:09.922936   49088 type.go:168] "Request Body" body=""
	I1202 19:13:09.923016   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:09.923330   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:13:09.923387   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:13:10.422968   49088 type.go:168] "Request Body" body=""
	I1202 19:13:10.423047   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:10.423382   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:10.922900   49088 type.go:168] "Request Body" body=""
	I1202 19:13:10.922972   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:10.923227   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:11.422967   49088 type.go:168] "Request Body" body=""
	I1202 19:13:11.423047   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:11.423372   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:11.922966   49088 type.go:168] "Request Body" body=""
	I1202 19:13:11.923038   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:11.923347   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:12.422911   49088 type.go:168] "Request Body" body=""
	I1202 19:13:12.422981   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:12.423291   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:13:12.423344   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:13:12.922990   49088 type.go:168] "Request Body" body=""
	I1202 19:13:12.923081   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:12.923443   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:13.423156   49088 type.go:168] "Request Body" body=""
	I1202 19:13:13.423234   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:13.423549   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:13.922900   49088 type.go:168] "Request Body" body=""
	I1202 19:13:13.922969   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:13.923235   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:14.422957   49088 type.go:168] "Request Body" body=""
	I1202 19:13:14.423029   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:14.423396   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:13:14.423449   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:13:14.923039   49088 type.go:168] "Request Body" body=""
	I1202 19:13:14.923111   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:14.923397   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:15.423223   49088 type.go:168] "Request Body" body=""
	I1202 19:13:15.423302   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:15.423557   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:15.922960   49088 type.go:168] "Request Body" body=""
	I1202 19:13:15.923034   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:15.923367   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:16.423062   49088 type.go:168] "Request Body" body=""
	I1202 19:13:16.423140   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:16.423473   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:13:16.423529   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:13:16.923839   49088 type.go:168] "Request Body" body=""
	I1202 19:13:16.923917   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:16.924188   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:17.422894   49088 type.go:168] "Request Body" body=""
	I1202 19:13:17.422973   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:17.423308   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:17.922909   49088 type.go:168] "Request Body" body=""
	I1202 19:13:17.922985   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:17.923348   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:18.423042   49088 type.go:168] "Request Body" body=""
	I1202 19:13:18.423112   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:18.423378   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:18.923054   49088 type.go:168] "Request Body" body=""
	I1202 19:13:18.923129   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:18.923451   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:13:18.923507   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:13:19.422987   49088 type.go:168] "Request Body" body=""
	I1202 19:13:19.423063   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:19.423405   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:19.923563   49088 type.go:168] "Request Body" body=""
	I1202 19:13:19.923634   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:19.923942   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:20.423766   49088 type.go:168] "Request Body" body=""
	I1202 19:13:20.423836   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:20.424183   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:20.922889   49088 type.go:168] "Request Body" body=""
	I1202 19:13:20.922963   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:20.923286   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:21.422910   49088 type.go:168] "Request Body" body=""
	I1202 19:13:21.422986   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:21.423265   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:13:21.423304   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:13:21.922960   49088 type.go:168] "Request Body" body=""
	I1202 19:13:21.923032   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:21.923372   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:22.422982   49088 type.go:168] "Request Body" body=""
	I1202 19:13:22.423066   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:22.423408   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:22.923660   49088 type.go:168] "Request Body" body=""
	I1202 19:13:22.923726   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:22.923989   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:23.423852   49088 type.go:168] "Request Body" body=""
	I1202 19:13:23.423930   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:23.424355   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:13:23.424410   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:13:23.922972   49088 type.go:168] "Request Body" body=""
	I1202 19:13:23.923088   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:23.923457   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:24.423150   49088 type.go:168] "Request Body" body=""
	I1202 19:13:24.423233   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:24.423566   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:24.923507   49088 type.go:168] "Request Body" body=""
	I1202 19:13:24.923591   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:24.923948   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:25.423717   49088 type.go:168] "Request Body" body=""
	I1202 19:13:25.423797   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:25.424161   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:25.922841   49088 node_ready.go:38] duration metric: took 6m0.000085627s for node "functional-449836" to be "Ready" ...
	I1202 19:13:25.925875   49088 out.go:203] 
	W1202 19:13:25.928738   49088 out.go:285] X Exiting due to GUEST_START: failed to start node: wait 6m0s for node: waiting for node to be ready: WaitNodeCondition: context deadline exceeded
	W1202 19:13:25.928760   49088 out.go:285] * 
	W1202 19:13:25.930899   49088 out.go:308] ╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	I1202 19:13:25.934748   49088 out.go:203] 
	
	
	==> container status <==
	CONTAINER           IMAGE               CREATED             STATE               NAME                ATTEMPT             POD ID              POD                 NAMESPACE
	
	
	==> containerd <==
	Dec 02 19:07:22 functional-449836 containerd[5842]: time="2025-12-02T19:07:22.957986214Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1
	Dec 02 19:07:22 functional-449836 containerd[5842]: time="2025-12-02T19:07:22.957997226Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1
	Dec 02 19:07:22 functional-449836 containerd[5842]: time="2025-12-02T19:07:22.958008417Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1
	Dec 02 19:07:22 functional-449836 containerd[5842]: time="2025-12-02T19:07:22.958018034Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1
	Dec 02 19:07:22 functional-449836 containerd[5842]: time="2025-12-02T19:07:22.958036085Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1
	Dec 02 19:07:22 functional-449836 containerd[5842]: time="2025-12-02T19:07:22.958054506Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1
	Dec 02 19:07:22 functional-449836 containerd[5842]: time="2025-12-02T19:07:22.958070481Z" level=info msg="runtime interface created"
	Dec 02 19:07:22 functional-449836 containerd[5842]: time="2025-12-02T19:07:22.958076692Z" level=info msg="created NRI interface"
	Dec 02 19:07:22 functional-449836 containerd[5842]: time="2025-12-02T19:07:22.958098690Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1
	Dec 02 19:07:22 functional-449836 containerd[5842]: time="2025-12-02T19:07:22.958133217Z" level=info msg="Connect containerd service"
	Dec 02 19:07:22 functional-449836 containerd[5842]: time="2025-12-02T19:07:22.958460151Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this"
	Dec 02 19:07:22 functional-449836 containerd[5842]: time="2025-12-02T19:07:22.959114281Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config"
	Dec 02 19:07:22 functional-449836 containerd[5842]: time="2025-12-02T19:07:22.969745294Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc
	Dec 02 19:07:22 functional-449836 containerd[5842]: time="2025-12-02T19:07:22.969836502Z" level=info msg=serving... address=/run/containerd/containerd.sock
	Dec 02 19:07:22 functional-449836 containerd[5842]: time="2025-12-02T19:07:22.969857893Z" level=info msg="Start subscribing containerd event"
	Dec 02 19:07:22 functional-449836 containerd[5842]: time="2025-12-02T19:07:22.969939477Z" level=info msg="Start recovering state"
	Dec 02 19:07:22 functional-449836 containerd[5842]: time="2025-12-02T19:07:22.992144454Z" level=info msg="Start event monitor"
	Dec 02 19:07:22 functional-449836 containerd[5842]: time="2025-12-02T19:07:22.992422247Z" level=info msg="Start cni network conf syncer for default"
	Dec 02 19:07:22 functional-449836 containerd[5842]: time="2025-12-02T19:07:22.992500294Z" level=info msg="Start streaming server"
	Dec 02 19:07:22 functional-449836 containerd[5842]: time="2025-12-02T19:07:22.992620565Z" level=info msg="Registered namespace \"k8s.io\" with NRI"
	Dec 02 19:07:22 functional-449836 containerd[5842]: time="2025-12-02T19:07:22.992683392Z" level=info msg="runtime interface starting up..."
	Dec 02 19:07:22 functional-449836 containerd[5842]: time="2025-12-02T19:07:22.992736816Z" level=info msg="starting plugins..."
	Dec 02 19:07:22 functional-449836 containerd[5842]: time="2025-12-02T19:07:22.992796598Z" level=info msg="Synchronizing NRI (plugin) with current runtime state"
	Dec 02 19:07:22 functional-449836 systemd[1]: Started containerd.service - containerd container runtime.
	Dec 02 19:07:22 functional-449836 containerd[5842]: time="2025-12-02T19:07:22.994992948Z" level=info msg="containerd successfully booted in 0.056864s"
	
	
	==> describe nodes <==
	command /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" failed with error: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:13:30.339413    9161 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:13:30.339999    9161 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:13:30.341620    9161 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:13:30.342038    9161 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:13:30.343511    9161 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
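	The `describe nodes` failure above is a downstream symptom: nothing is listening on the apiserver port. A minimal connectivity probe (a diagnostic sketch, not part of the test harness; host `127.0.0.1` and port `8441` are taken from the log) using only bash's built-in `/dev/tcp` redirection:

	```shell
	# Probe the apiserver port; "connection refused" here matches the
	# kubectl errors in the stderr block above.
	if timeout 2 bash -c 'echo > /dev/tcp/127.0.0.1/8441' 2>/dev/null; then
	  echo "port 8441: open"
	else
	  echo "port 8441: connection refused or timed out"
	fi
	```

	Either outcome is informative: "open" would point the investigation away from the apiserver process itself, while "refused" is consistent with the kubelet never successfully starting the control plane.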
	
	
	==> dmesg <==
	[Dec 2 18:17] ACPI: SRAT not present
	[  +0.000000] ACPI: SRAT not present
	[  +0.000000] SPI driver altr_a10sr has no spi_device_id for altr,a10sr
	[  +0.015127] device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
	[  +0.494583] systemd[1]: Configuration file /run/systemd/system/netplan-ovs-cleanup.service is marked world-inaccessible. This has no effect as configuration data is accessible via APIs without restrictions. Proceeding anyway.
	[  +0.035754] systemd[1]: /lib/systemd/system/snapd.service:23: Unknown key name 'RestartMode' in section 'Service', ignoring.
	[  +0.870945] ena 0000:00:05.0: LLQ is not supported Fallback to host mode policy.
	[  +6.299680] kauditd_printk_skb: 36 callbacks suppressed
	
	
	==> kernel <==
	 19:13:30 up 55 min,  0 user,  load average: 0.32, 0.32, 0.51
	Linux functional-449836 5.15.0-1084-aws #91~20.04.1-Ubuntu SMP Fri May 2 07:00:04 UTC 2025 aarch64 GNU/Linux
	PRETTY_NAME="Debian GNU/Linux 12 (bookworm)"
	
	
	==> kubelet <==
	Dec 02 19:13:26 functional-449836 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 02 19:13:27 functional-449836 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 812.
	Dec 02 19:13:27 functional-449836 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 02 19:13:27 functional-449836 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 02 19:13:27 functional-449836 kubelet[8936]: E1202 19:13:27.487415    8936 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 02 19:13:27 functional-449836 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 02 19:13:27 functional-449836 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 02 19:13:28 functional-449836 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 813.
	Dec 02 19:13:28 functional-449836 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 02 19:13:28 functional-449836 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 02 19:13:28 functional-449836 kubelet[9032]: E1202 19:13:28.233519    9032 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 02 19:13:28 functional-449836 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 02 19:13:28 functional-449836 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 02 19:13:28 functional-449836 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 814.
	Dec 02 19:13:28 functional-449836 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 02 19:13:28 functional-449836 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 02 19:13:28 functional-449836 kubelet[9053]: E1202 19:13:28.990126    9053 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 02 19:13:28 functional-449836 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 02 19:13:28 functional-449836 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 02 19:13:29 functional-449836 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 815.
	Dec 02 19:13:29 functional-449836 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 02 19:13:29 functional-449836 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 02 19:13:29 functional-449836 kubelet[9074]: E1202 19:13:29.736118    9074 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 02 19:13:29 functional-449836 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 02 19:13:29 functional-449836 systemd[1]: kubelet.service: Failed with result 'exit-code'.
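	The kubelet crash loop above (restart counters 812-815) is a configuration validation failure, not a crash: the kubelet refuses to run because the host is on cgroup v1. A quick check of the host's cgroup version (a diagnostic sketch; assumes a Linux host with GNU `stat`):

	```shell
	# cgroup2fs => unified cgroup v2 hierarchy; tmpfs => legacy cgroup v1,
	# which matches the "cgroup v1 support is unsupported" kubelet error.
	stat -fc %T /sys/fs/cgroup/
	```

	On this runner (Ubuntu 20.04 per the `uname` output, kernel 5.15.0-1084-aws) a `tmpfs` result would confirm the v1 hierarchy that the v1.35.0-beta.0 kubelet rejects.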
	

                                                
                                                
-- /stdout --
helpers_test.go:262: (dbg) Run:  out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-449836 -n functional-449836
helpers_test.go:262: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-449836 -n functional-449836: exit status 2 (363.957609ms)

                                                
                                                
-- stdout --
	Stopped

                                                
                                                
-- /stdout --
helpers_test.go:262: status error: exit status 2 (may be ok)
helpers_test.go:264: "functional-449836" apiserver is not running, skipping kubectl commands (state="Stopped")
--- FAIL: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/KubectlGetPods (2.32s)

                                                
                                    
TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/MinikubeKubectlCmd (2.27s)

                                                
                                                
=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/MinikubeKubectlCmd
functional_test.go:731: (dbg) Run:  out/minikube-linux-arm64 -p functional-449836 kubectl -- --context functional-449836 get pods
functional_test.go:731: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-449836 kubectl -- --context functional-449836 get pods: exit status 1 (106.652626ms)

** stderr ** 
	The connection to the server 192.168.49.2:8441 was refused - did you specify the right host or port?

** /stderr **
functional_test.go:734: failed to get pods. args "out/minikube-linux-arm64 -p functional-449836 kubectl -- --context functional-449836 get pods": exit status 1
helpers_test.go:222: -----------------------post-mortem--------------------------------
helpers_test.go:223: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/MinikubeKubectlCmd]: network settings <======
helpers_test.go:230: HOST ENV snapshots: PROXY env: HTTP_PROXY="<empty>" HTTPS_PROXY="<empty>" NO_PROXY="<empty>"
helpers_test.go:238: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/MinikubeKubectlCmd]: docker inspect <======
helpers_test.go:239: (dbg) Run:  docker inspect functional-449836
helpers_test.go:243: (dbg) docker inspect functional-449836:

-- stdout --
	[
	    {
	        "Id": "6870f21b62bb6903aca3129f1ce4723cf3f2ffad99a50b164f8e2dc04b50e75d",
	        "Created": "2025-12-02T18:58:53.515075222Z",
	        "Path": "/usr/local/bin/entrypoint",
	        "Args": [
	            "/sbin/init"
	        ],
	        "State": {
	            "Status": "running",
	            "Running": true,
	            "Paused": false,
	            "Restarting": false,
	            "OOMKilled": false,
	            "Dead": false,
	            "Pid": 43093,
	            "ExitCode": 0,
	            "Error": "",
	            "StartedAt": "2025-12-02T18:58:53.587847975Z",
	            "FinishedAt": "0001-01-01T00:00:00Z"
	        },
	        "Image": "sha256:ac919894123858c63a6b115b7a0677e38aafc32ba4f00c3ebbd7c61e958451be",
	        "ResolvConfPath": "/var/lib/docker/containers/6870f21b62bb6903aca3129f1ce4723cf3f2ffad99a50b164f8e2dc04b50e75d/resolv.conf",
	        "HostnamePath": "/var/lib/docker/containers/6870f21b62bb6903aca3129f1ce4723cf3f2ffad99a50b164f8e2dc04b50e75d/hostname",
	        "HostsPath": "/var/lib/docker/containers/6870f21b62bb6903aca3129f1ce4723cf3f2ffad99a50b164f8e2dc04b50e75d/hosts",
	        "LogPath": "/var/lib/docker/containers/6870f21b62bb6903aca3129f1ce4723cf3f2ffad99a50b164f8e2dc04b50e75d/6870f21b62bb6903aca3129f1ce4723cf3f2ffad99a50b164f8e2dc04b50e75d-json.log",
	        "Name": "/functional-449836",
	        "RestartCount": 0,
	        "Driver": "overlay2",
	        "Platform": "linux",
	        "MountLabel": "",
	        "ProcessLabel": "",
	        "AppArmorProfile": "unconfined",
	        "ExecIDs": null,
	        "HostConfig": {
	            "Binds": [
	                "/lib/modules:/lib/modules:ro",
	                "functional-449836:/var"
	            ],
	            "ContainerIDFile": "",
	            "LogConfig": {
	                "Type": "json-file",
	                "Config": {}
	            },
	            "NetworkMode": "functional-449836",
	            "PortBindings": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "8441/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ]
	            },
	            "RestartPolicy": {
	                "Name": "no",
	                "MaximumRetryCount": 0
	            },
	            "AutoRemove": false,
	            "VolumeDriver": "",
	            "VolumesFrom": null,
	            "ConsoleSize": [
	                0,
	                0
	            ],
	            "CapAdd": null,
	            "CapDrop": null,
	            "CgroupnsMode": "host",
	            "Dns": [],
	            "DnsOptions": [],
	            "DnsSearch": [],
	            "ExtraHosts": null,
	            "GroupAdd": null,
	            "IpcMode": "private",
	            "Cgroup": "",
	            "Links": null,
	            "OomScoreAdj": 0,
	            "PidMode": "",
	            "Privileged": true,
	            "PublishAllPorts": false,
	            "ReadonlyRootfs": false,
	            "SecurityOpt": [
	                "seccomp=unconfined",
	                "apparmor=unconfined",
	                "label=disable"
	            ],
	            "Tmpfs": {
	                "/run": "",
	                "/tmp": ""
	            },
	            "UTSMode": "",
	            "UsernsMode": "",
	            "ShmSize": 67108864,
	            "Runtime": "runc",
	            "Isolation": "",
	            "CpuShares": 0,
	            "Memory": 4294967296,
	            "NanoCpus": 2000000000,
	            "CgroupParent": "",
	            "BlkioWeight": 0,
	            "BlkioWeightDevice": [],
	            "BlkioDeviceReadBps": [],
	            "BlkioDeviceWriteBps": [],
	            "BlkioDeviceReadIOps": [],
	            "BlkioDeviceWriteIOps": [],
	            "CpuPeriod": 0,
	            "CpuQuota": 0,
	            "CpuRealtimePeriod": 0,
	            "CpuRealtimeRuntime": 0,
	            "CpusetCpus": "",
	            "CpusetMems": "",
	            "Devices": [],
	            "DeviceCgroupRules": null,
	            "DeviceRequests": null,
	            "MemoryReservation": 0,
	            "MemorySwap": 8589934592,
	            "MemorySwappiness": null,
	            "OomKillDisable": false,
	            "PidsLimit": null,
	            "Ulimits": [],
	            "CpuCount": 0,
	            "CpuPercent": 0,
	            "IOMaximumIOps": 0,
	            "IOMaximumBandwidth": 0,
	            "MaskedPaths": null,
	            "ReadonlyPaths": null
	        },
	        "GraphDriver": {
	            "Data": {
	                "ID": "6870f21b62bb6903aca3129f1ce4723cf3f2ffad99a50b164f8e2dc04b50e75d",
	                "LowerDir": "/var/lib/docker/overlay2/41ea5a18fbf8709879a7fc4066a6a4a1474aa86e898ee1ccabe5669a1871131d-init/diff:/var/lib/docker/overlay2/a59c61675ee48e07a7f4a8725bd393449453344ad8907963779ea1c0059d936c/diff",
	                "MergedDir": "/var/lib/docker/overlay2/41ea5a18fbf8709879a7fc4066a6a4a1474aa86e898ee1ccabe5669a1871131d/merged",
	                "UpperDir": "/var/lib/docker/overlay2/41ea5a18fbf8709879a7fc4066a6a4a1474aa86e898ee1ccabe5669a1871131d/diff",
	                "WorkDir": "/var/lib/docker/overlay2/41ea5a18fbf8709879a7fc4066a6a4a1474aa86e898ee1ccabe5669a1871131d/work"
	            },
	            "Name": "overlay2"
	        },
	        "Mounts": [
	            {
	                "Type": "bind",
	                "Source": "/lib/modules",
	                "Destination": "/lib/modules",
	                "Mode": "ro",
	                "RW": false,
	                "Propagation": "rprivate"
	            },
	            {
	                "Type": "volume",
	                "Name": "functional-449836",
	                "Source": "/var/lib/docker/volumes/functional-449836/_data",
	                "Destination": "/var",
	                "Driver": "local",
	                "Mode": "z",
	                "RW": true,
	                "Propagation": ""
	            }
	        ],
	        "Config": {
	            "Hostname": "functional-449836",
	            "Domainname": "",
	            "User": "",
	            "AttachStdin": false,
	            "AttachStdout": false,
	            "AttachStderr": false,
	            "ExposedPorts": {
	                "22/tcp": {},
	                "2376/tcp": {},
	                "32443/tcp": {},
	                "5000/tcp": {},
	                "8441/tcp": {}
	            },
	            "Tty": true,
	            "OpenStdin": false,
	            "StdinOnce": false,
	            "Env": [
	                "container=docker",
	                "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin"
	            ],
	            "Cmd": null,
	            "Image": "gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b",
	            "Volumes": null,
	            "WorkingDir": "/",
	            "Entrypoint": [
	                "/usr/local/bin/entrypoint",
	                "/sbin/init"
	            ],
	            "OnBuild": null,
	            "Labels": {
	                "created_by.minikube.sigs.k8s.io": "true",
	                "mode.minikube.sigs.k8s.io": "functional-449836",
	                "name.minikube.sigs.k8s.io": "functional-449836",
	                "role.minikube.sigs.k8s.io": ""
	            },
	            "StopSignal": "SIGRTMIN+3"
	        },
	        "NetworkSettings": {
	            "Bridge": "",
	            "SandboxID": "410fbaab809a56b556195f7ed6eeff8dcd31e9020fb1dbfacf74828b79df3d88",
	            "SandboxKey": "/var/run/docker/netns/410fbaab809a",
	            "Ports": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "32788"
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "32789"
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "32792"
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "32790"
	                    }
	                ],
	                "8441/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "32791"
	                    }
	                ]
	            },
	            "HairpinMode": false,
	            "LinkLocalIPv6Address": "",
	            "LinkLocalIPv6PrefixLen": 0,
	            "SecondaryIPAddresses": null,
	            "SecondaryIPv6Addresses": null,
	            "EndpointID": "",
	            "Gateway": "",
	            "GlobalIPv6Address": "",
	            "GlobalIPv6PrefixLen": 0,
	            "IPAddress": "",
	            "IPPrefixLen": 0,
	            "IPv6Gateway": "",
	            "MacAddress": "",
	            "Networks": {
	                "functional-449836": {
	                    "IPAMConfig": {
	                        "IPv4Address": "192.168.49.2"
	                    },
	                    "Links": null,
	                    "Aliases": null,
	                    "MacAddress": "82:18:11:ce:46:48",
	                    "DriverOpts": null,
	                    "GwPriority": 0,
	                    "NetworkID": "3cb7d67d4fa267ddd6b37211325b224fb3fb811be8ff57bda18e19f6929ec9c8",
	                    "EndpointID": "20c8c1a67e53d7615656777f73986a40cb1c6affb22c4db185c479ac85cbdb14",
	                    "Gateway": "192.168.49.1",
	                    "IPAddress": "192.168.49.2",
	                    "IPPrefixLen": 24,
	                    "IPv6Gateway": "",
	                    "GlobalIPv6Address": "",
	                    "GlobalIPv6PrefixLen": 0,
	                    "DNSNames": [
	                        "functional-449836",
	                        "6870f21b62bb"
	                    ]
	                }
	            }
	        }
	    }
	]

-- /stdout --
helpers_test.go:247: (dbg) Run:  out/minikube-linux-arm64 status --format={{.Host}} -p functional-449836 -n functional-449836
helpers_test.go:247: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.Host}} -p functional-449836 -n functional-449836: exit status 2 (305.987744ms)

-- stdout --
	Running

-- /stdout --
helpers_test.go:247: status error: exit status 2 (may be ok)
helpers_test.go:252: <<< TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/MinikubeKubectlCmd FAILED: start of post-mortem logs <<<
helpers_test.go:253: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/MinikubeKubectlCmd]: minikube logs <======
helpers_test.go:255: (dbg) Run:  out/minikube-linux-arm64 -p functional-449836 logs -n 25
helpers_test.go:260: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/MinikubeKubectlCmd logs: 
-- stdout --
	
	==> Audit <==
	┌─────────┬─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┬───────────────────┬─────────┬─────────┬─────────────────────┬─────────────────────┐
	│ COMMAND │                                                                          ARGS                                                                           │      PROFILE      │  USER   │ VERSION │     START TIME      │      END TIME       │
	├─────────┼─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┼───────────────────┼─────────┼─────────┼─────────────────────┼─────────────────────┤
	│ image   │ functional-224594 image ls --format short --alsologtostderr                                                                                             │ functional-224594 │ jenkins │ v1.37.0 │ 02 Dec 25 18:58 UTC │ 02 Dec 25 18:58 UTC │
	│ image   │ functional-224594 image ls --format yaml --alsologtostderr                                                                                              │ functional-224594 │ jenkins │ v1.37.0 │ 02 Dec 25 18:58 UTC │ 02 Dec 25 18:58 UTC │
	│ ssh     │ functional-224594 ssh pgrep buildkitd                                                                                                                   │ functional-224594 │ jenkins │ v1.37.0 │ 02 Dec 25 18:58 UTC │                     │
	│ image   │ functional-224594 image ls --format json --alsologtostderr                                                                                              │ functional-224594 │ jenkins │ v1.37.0 │ 02 Dec 25 18:58 UTC │ 02 Dec 25 18:58 UTC │
	│ image   │ functional-224594 image ls --format table --alsologtostderr                                                                                             │ functional-224594 │ jenkins │ v1.37.0 │ 02 Dec 25 18:58 UTC │ 02 Dec 25 18:58 UTC │
	│ image   │ functional-224594 image build -t localhost/my-image:functional-224594 testdata/build --alsologtostderr                                                  │ functional-224594 │ jenkins │ v1.37.0 │ 02 Dec 25 18:58 UTC │ 02 Dec 25 18:58 UTC │
	│ image   │ functional-224594 image ls                                                                                                                              │ functional-224594 │ jenkins │ v1.37.0 │ 02 Dec 25 18:58 UTC │ 02 Dec 25 18:58 UTC │
	│ delete  │ -p functional-224594                                                                                                                                    │ functional-224594 │ jenkins │ v1.37.0 │ 02 Dec 25 18:58 UTC │ 02 Dec 25 18:58 UTC │
	│ start   │ -p functional-449836 --memory=4096 --apiserver-port=8441 --wait=all --driver=docker  --container-runtime=containerd --kubernetes-version=v1.35.0-beta.0 │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 18:58 UTC │                     │
	│ start   │ -p functional-449836 --alsologtostderr -v=8                                                                                                             │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:07 UTC │                     │
	│ cache   │ functional-449836 cache add registry.k8s.io/pause:3.1                                                                                                   │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:13 UTC │ 02 Dec 25 19:13 UTC │
	│ cache   │ functional-449836 cache add registry.k8s.io/pause:3.3                                                                                                   │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:13 UTC │ 02 Dec 25 19:13 UTC │
	│ cache   │ functional-449836 cache add registry.k8s.io/pause:latest                                                                                                │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:13 UTC │ 02 Dec 25 19:13 UTC │
	│ cache   │ functional-449836 cache add minikube-local-cache-test:functional-449836                                                                                 │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:13 UTC │ 02 Dec 25 19:13 UTC │
	│ cache   │ functional-449836 cache delete minikube-local-cache-test:functional-449836                                                                              │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:13 UTC │ 02 Dec 25 19:13 UTC │
	│ cache   │ delete registry.k8s.io/pause:3.3                                                                                                                        │ minikube          │ jenkins │ v1.37.0 │ 02 Dec 25 19:13 UTC │ 02 Dec 25 19:13 UTC │
	│ cache   │ list                                                                                                                                                    │ minikube          │ jenkins │ v1.37.0 │ 02 Dec 25 19:13 UTC │ 02 Dec 25 19:13 UTC │
	│ ssh     │ functional-449836 ssh sudo crictl images                                                                                                                │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:13 UTC │ 02 Dec 25 19:13 UTC │
	│ ssh     │ functional-449836 ssh sudo crictl rmi registry.k8s.io/pause:latest                                                                                      │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:13 UTC │ 02 Dec 25 19:13 UTC │
	│ ssh     │ functional-449836 ssh sudo crictl inspecti registry.k8s.io/pause:latest                                                                                 │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:13 UTC │                     │
	│ cache   │ functional-449836 cache reload                                                                                                                          │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:13 UTC │ 02 Dec 25 19:13 UTC │
	│ ssh     │ functional-449836 ssh sudo crictl inspecti registry.k8s.io/pause:latest                                                                                 │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:13 UTC │ 02 Dec 25 19:13 UTC │
	│ cache   │ delete registry.k8s.io/pause:3.1                                                                                                                        │ minikube          │ jenkins │ v1.37.0 │ 02 Dec 25 19:13 UTC │ 02 Dec 25 19:13 UTC │
	│ cache   │ delete registry.k8s.io/pause:latest                                                                                                                     │ minikube          │ jenkins │ v1.37.0 │ 02 Dec 25 19:13 UTC │ 02 Dec 25 19:13 UTC │
	│ kubectl │ functional-449836 kubectl -- --context functional-449836 get pods                                                                                       │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:13 UTC │                     │
	└─────────┴─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┴───────────────────┴─────────┴─────────┴─────────────────────┴─────────────────────┘
	
	
	==> Last Start <==
	Log file created at: 2025/12/02 19:07:19
	Running on machine: ip-172-31-31-251
	Binary: Built with gc go1.25.3 for linux/arm64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I1202 19:07:19.929855   49088 out.go:360] Setting OutFile to fd 1 ...
	I1202 19:07:19.930082   49088 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1202 19:07:19.930109   49088 out.go:374] Setting ErrFile to fd 2...
	I1202 19:07:19.930127   49088 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1202 19:07:19.930424   49088 root.go:338] Updating PATH: /home/jenkins/minikube-integration/22021-2487/.minikube/bin
	I1202 19:07:19.930829   49088 out.go:368] Setting JSON to false
	I1202 19:07:19.931678   49088 start.go:133] hostinfo: {"hostname":"ip-172-31-31-251","uptime":2976,"bootTime":1764699464,"procs":155,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1084-aws","kernelArch":"aarch64","virtualizationSystem":"","virtualizationRole":"","hostId":"982e3628-3742-4b3e-bb63-ac1b07660ec7"}
	I1202 19:07:19.931776   49088 start.go:143] virtualization:  
	I1202 19:07:19.935245   49088 out.go:179] * [functional-449836] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	I1202 19:07:19.939094   49088 out.go:179]   - MINIKUBE_LOCATION=22021
	I1202 19:07:19.939188   49088 notify.go:221] Checking for updates...
	I1202 19:07:19.944799   49088 out.go:179]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I1202 19:07:19.947646   49088 out.go:179]   - KUBECONFIG=/home/jenkins/minikube-integration/22021-2487/kubeconfig
	I1202 19:07:19.950501   49088 out.go:179]   - MINIKUBE_HOME=/home/jenkins/minikube-integration/22021-2487/.minikube
	I1202 19:07:19.953361   49088 out.go:179]   - MINIKUBE_BIN=out/minikube-linux-arm64
	I1202 19:07:19.956281   49088 out.go:179]   - MINIKUBE_FORCE_SYSTEMD=
	I1202 19:07:19.959695   49088 config.go:182] Loaded profile config "functional-449836": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-beta.0
	I1202 19:07:19.959887   49088 driver.go:422] Setting default libvirt URI to qemu:///system
	I1202 19:07:19.996438   49088 docker.go:124] docker version: linux-28.1.1:Docker Engine - Community
	I1202 19:07:19.996577   49088 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1202 19:07:20.063124   49088 info.go:266] docker info: {ID:EOU5:DNGX:XN6V:L2FZ:UXRM:5TWK:EVUR:KC2F:GT7Z:Y4O4:GB77:5PD3 Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:52 SystemTime:2025-12-02 19:07:20.053388152 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:a
arch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214839296 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-31-251 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx P
ath:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1202 19:07:20.063232   49088 docker.go:319] overlay module found
	I1202 19:07:20.066390   49088 out.go:179] * Using the docker driver based on existing profile
	I1202 19:07:20.069271   49088 start.go:309] selected driver: docker
	I1202 19:07:20.069311   49088 start.go:927] validating driver "docker" against &{Name:functional-449836 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-449836 Namespace:default APIServerHAVIP: APIS
erverName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false Disa
bleCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1202 19:07:20.069422   49088 start.go:938] status for docker: {Installed:true Healthy:true Running:false NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I1202 19:07:20.069541   49088 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1202 19:07:20.132012   49088 info.go:266] docker info: {ID:EOU5:DNGX:XN6V:L2FZ:UXRM:5TWK:EVUR:KC2F:GT7Z:Y4O4:GB77:5PD3 Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:52 SystemTime:2025-12-02 19:07:20.122627931 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:a
arch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214839296 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-31-251 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx P
ath:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1202 19:07:20.132615   49088 cni.go:84] Creating CNI manager for ""
	I1202 19:07:20.132692   49088 cni.go:143] "docker" driver + "containerd" runtime found, recommending kindnet
	I1202 19:07:20.132751   49088 start.go:353] cluster config:
	{Name:functional-449836 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-449836 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local C
ontainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPa
th: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1202 19:07:20.135845   49088 out.go:179] * Starting "functional-449836" primary control-plane node in "functional-449836" cluster
	I1202 19:07:20.138639   49088 cache.go:134] Beginning downloading kic base image for docker with containerd
	I1202 19:07:20.141498   49088 out.go:179] * Pulling base image v0.0.48-1764169655-21974 ...
	I1202 19:07:20.144479   49088 image.go:81] Checking for gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b in local docker daemon
	I1202 19:07:20.144604   49088 preload.go:188] Checking if preload exists for k8s version v1.35.0-beta.0 and runtime containerd
	I1202 19:07:20.163347   49088 image.go:100] Found gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b in local docker daemon, skipping pull
	I1202 19:07:20.163372   49088 cache.go:158] gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b exists in daemon, skipping load
	W1202 19:07:20.218193   49088 preload.go:144] https://storage.googleapis.com/minikube-preloaded-volume-tarballs/v18/v1.35.0-beta.0/preloaded-images-k8s-v18-v1.35.0-beta.0-containerd-overlay2-arm64.tar.lz4 status code: 404
	W1202 19:07:20.422833   49088 preload.go:144] https://github.com/kubernetes-sigs/minikube-preloads/releases/download/v18/preloaded-images-k8s-v18-v1.35.0-beta.0-containerd-overlay2-arm64.tar.lz4 status code: 404
	I1202 19:07:20.423042   49088 profile.go:143] Saving config to /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/config.json ...
	I1202 19:07:20.423128   49088 cache.go:107] acquiring lock: {Name:mkb3ffc95e4b7ac3756206049d851bf516a8abb7 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1202 19:07:20.423219   49088 cache.go:115] /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/gcr.io/k8s-minikube/storage-provisioner_v5 exists
	I1202 19:07:20.423234   49088 cache.go:96] cache image "gcr.io/k8s-minikube/storage-provisioner:v5" -> "/home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/gcr.io/k8s-minikube/storage-provisioner_v5" took 122.125µs
	I1202 19:07:20.423249   49088 cache.go:80] save to tar file gcr.io/k8s-minikube/storage-provisioner:v5 -> /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/gcr.io/k8s-minikube/storage-provisioner_v5 succeeded
	I1202 19:07:20.423267   49088 cache.go:107] acquiring lock: {Name:mkfa39bba55c97fa80e441f8dcbaf6dc6a2ab6fc Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1202 19:07:20.423303   49088 cache.go:115] /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/kube-apiserver_v1.35.0-beta.0 exists
	I1202 19:07:20.423312   49088 cache.go:96] cache image "registry.k8s.io/kube-apiserver:v1.35.0-beta.0" -> "/home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/kube-apiserver_v1.35.0-beta.0" took 47.557µs
	I1202 19:07:20.423318   49088 cache.go:80] save to tar file registry.k8s.io/kube-apiserver:v1.35.0-beta.0 -> /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/kube-apiserver_v1.35.0-beta.0 succeeded
	I1202 19:07:20.423331   49088 cache.go:107] acquiring lock: {Name:mk7e3720bc30e96a70479f1acc707ef52791d566 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1202 19:07:20.423365   49088 cache.go:115] /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/kube-controller-manager_v1.35.0-beta.0 exists
	I1202 19:07:20.423374   49088 cache.go:96] cache image "registry.k8s.io/kube-controller-manager:v1.35.0-beta.0" -> "/home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/kube-controller-manager_v1.35.0-beta.0" took 44.415µs
	I1202 19:07:20.423380   49088 cache.go:80] save to tar file registry.k8s.io/kube-controller-manager:v1.35.0-beta.0 -> /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/kube-controller-manager_v1.35.0-beta.0 succeeded
	I1202 19:07:20.423395   49088 cache.go:107] acquiring lock: {Name:mk87fcb81abcb9216a37cb770c1db1797c0a7f91 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1202 19:07:20.423422   49088 cache.go:115] /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/kube-scheduler_v1.35.0-beta.0 exists
	I1202 19:07:20.423432   49088 cache.go:96] cache image "registry.k8s.io/kube-scheduler:v1.35.0-beta.0" -> "/home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/kube-scheduler_v1.35.0-beta.0" took 37.579µs
	I1202 19:07:20.423438   49088 cache.go:80] save to tar file registry.k8s.io/kube-scheduler:v1.35.0-beta.0 -> /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/kube-scheduler_v1.35.0-beta.0 succeeded
	I1202 19:07:20.423447   49088 cache.go:107] acquiring lock: {Name:mkb0da8840651a370490ea2b46213e13fc0d5dac Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1202 19:07:20.423476   49088 cache.go:115] /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/kube-proxy_v1.35.0-beta.0 exists
	I1202 19:07:20.423484   49088 cache.go:96] cache image "registry.k8s.io/kube-proxy:v1.35.0-beta.0" -> "/home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/kube-proxy_v1.35.0-beta.0" took 38.933µs
	I1202 19:07:20.423490   49088 cache.go:80] save to tar file registry.k8s.io/kube-proxy:v1.35.0-beta.0 -> /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/kube-proxy_v1.35.0-beta.0 succeeded
	I1202 19:07:20.423510   49088 cache.go:107] acquiring lock: {Name:mk9eec99a3e8e54b076a2ce506d08ceb8a7f49cb Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1202 19:07:20.423540   49088 cache.go:115] /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/pause_3.10.1 exists
	I1202 19:07:20.423549   49088 cache.go:96] cache image "registry.k8s.io/pause:3.10.1" -> "/home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/pause_3.10.1" took 40.796µs
	I1202 19:07:20.423555   49088 cache.go:80] save to tar file registry.k8s.io/pause:3.10.1 -> /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/pause_3.10.1 succeeded
	I1202 19:07:20.423569   49088 cache.go:243] Successfully downloaded all kic artifacts
	I1202 19:07:20.423588   49088 cache.go:107] acquiring lock: {Name:mkbf2c8ea9fae755e8e7ae1c483527f313757bae Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1202 19:07:20.423620   49088 cache.go:115] /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/coredns/coredns_v1.13.1 exists
	I1202 19:07:20.423629   49088 cache.go:96] cache image "registry.k8s.io/coredns/coredns:v1.13.1" -> "/home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/coredns/coredns_v1.13.1" took 42.487µs
	I1202 19:07:20.423635   49088 cache.go:80] save to tar file registry.k8s.io/coredns/coredns:v1.13.1 -> /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/coredns/coredns_v1.13.1 succeeded
	I1202 19:07:20.423646   49088 start.go:360] acquireMachinesLock for functional-449836: {Name:mk8999fdfa518fc15358d07431fe9bec286a035e Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1202 19:07:20.423706   49088 start.go:364] duration metric: took 31.868µs to acquireMachinesLock for "functional-449836"
	I1202 19:07:20.423570   49088 cache.go:107] acquiring lock: {Name:mk280b51a6d3bfe0cb60ae7355309f1bf1f99e1d Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1202 19:07:20.423753   49088 start.go:96] Skipping create...Using existing machine configuration
	I1202 19:07:20.423783   49088 fix.go:54] fixHost starting: 
	I1202 19:07:20.423759   49088 cache.go:115] /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/etcd_3.6.5-0 exists
	I1202 19:07:20.423888   49088 cache.go:96] cache image "registry.k8s.io/etcd:3.6.5-0" -> "/home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/etcd_3.6.5-0" took 323.2µs
	I1202 19:07:20.423896   49088 cache.go:80] save to tar file registry.k8s.io/etcd:3.6.5-0 -> /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/etcd_3.6.5-0 succeeded
	I1202 19:07:20.423906   49088 cache.go:87] Successfully saved all images to host disk.
	I1202 19:07:20.424111   49088 cli_runner.go:164] Run: docker container inspect functional-449836 --format={{.State.Status}}
	I1202 19:07:20.441213   49088 fix.go:112] recreateIfNeeded on functional-449836: state=Running err=<nil>
	W1202 19:07:20.441244   49088 fix.go:138] unexpected machine state, will restart: <nil>
	I1202 19:07:20.444707   49088 out.go:252] * Updating the running docker "functional-449836" container ...
	I1202 19:07:20.444749   49088 machine.go:94] provisionDockerMachine start ...
	I1202 19:07:20.444842   49088 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-449836
	I1202 19:07:20.461943   49088 main.go:143] libmachine: Using SSH client type: native
	I1202 19:07:20.462269   49088 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3dad70] 0x3dd270 <nil>  [] 0s} 127.0.0.1 32788 <nil> <nil>}
	I1202 19:07:20.462284   49088 main.go:143] libmachine: About to run SSH command:
	hostname
	I1202 19:07:20.612055   49088 main.go:143] libmachine: SSH cmd err, output: <nil>: functional-449836
	
	I1202 19:07:20.612125   49088 ubuntu.go:182] provisioning hostname "functional-449836"
	I1202 19:07:20.612222   49088 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-449836
	I1202 19:07:20.629856   49088 main.go:143] libmachine: Using SSH client type: native
	I1202 19:07:20.630166   49088 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3dad70] 0x3dd270 <nil>  [] 0s} 127.0.0.1 32788 <nil> <nil>}
	I1202 19:07:20.630180   49088 main.go:143] libmachine: About to run SSH command:
	sudo hostname functional-449836 && echo "functional-449836" | sudo tee /etc/hostname
	I1202 19:07:20.793419   49088 main.go:143] libmachine: SSH cmd err, output: <nil>: functional-449836
	
	I1202 19:07:20.793536   49088 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-449836
	I1202 19:07:20.812441   49088 main.go:143] libmachine: Using SSH client type: native
	I1202 19:07:20.812754   49088 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3dad70] 0x3dd270 <nil>  [] 0s} 127.0.0.1 32788 <nil> <nil>}
	I1202 19:07:20.812775   49088 main.go:143] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sfunctional-449836' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 functional-449836/g' /etc/hosts;
				else 
					echo '127.0.1.1 functional-449836' | sudo tee -a /etc/hosts; 
				fi
			fi
	I1202 19:07:20.961443   49088 main.go:143] libmachine: SSH cmd err, output: <nil>: 
	I1202 19:07:20.961480   49088 ubuntu.go:188] set auth options {CertDir:/home/jenkins/minikube-integration/22021-2487/.minikube CaCertPath:/home/jenkins/minikube-integration/22021-2487/.minikube/certs/ca.pem CaPrivateKeyPath:/home/jenkins/minikube-integration/22021-2487/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/home/jenkins/minikube-integration/22021-2487/.minikube/machines/server.pem ServerKeyPath:/home/jenkins/minikube-integration/22021-2487/.minikube/machines/server-key.pem ClientKeyPath:/home/jenkins/minikube-integration/22021-2487/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/home/jenkins/minikube-integration/22021-2487/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/home/jenkins/minikube-integration/22021-2487/.minikube}
	I1202 19:07:20.961539   49088 ubuntu.go:190] setting up certificates
	I1202 19:07:20.961556   49088 provision.go:84] configureAuth start
	I1202 19:07:20.961634   49088 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" functional-449836
	I1202 19:07:20.990731   49088 provision.go:143] copyHostCerts
	I1202 19:07:20.990790   49088 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22021-2487/.minikube/certs/ca.pem -> /home/jenkins/minikube-integration/22021-2487/.minikube/ca.pem
	I1202 19:07:20.990838   49088 exec_runner.go:144] found /home/jenkins/minikube-integration/22021-2487/.minikube/ca.pem, removing ...
	I1202 19:07:20.990856   49088 exec_runner.go:203] rm: /home/jenkins/minikube-integration/22021-2487/.minikube/ca.pem
	I1202 19:07:20.990938   49088 exec_runner.go:151] cp: /home/jenkins/minikube-integration/22021-2487/.minikube/certs/ca.pem --> /home/jenkins/minikube-integration/22021-2487/.minikube/ca.pem (1082 bytes)
	I1202 19:07:20.991037   49088 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22021-2487/.minikube/certs/cert.pem -> /home/jenkins/minikube-integration/22021-2487/.minikube/cert.pem
	I1202 19:07:20.991060   49088 exec_runner.go:144] found /home/jenkins/minikube-integration/22021-2487/.minikube/cert.pem, removing ...
	I1202 19:07:20.991069   49088 exec_runner.go:203] rm: /home/jenkins/minikube-integration/22021-2487/.minikube/cert.pem
	I1202 19:07:20.991098   49088 exec_runner.go:151] cp: /home/jenkins/minikube-integration/22021-2487/.minikube/certs/cert.pem --> /home/jenkins/minikube-integration/22021-2487/.minikube/cert.pem (1123 bytes)
	I1202 19:07:20.991189   49088 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22021-2487/.minikube/certs/key.pem -> /home/jenkins/minikube-integration/22021-2487/.minikube/key.pem
	I1202 19:07:20.991211   49088 exec_runner.go:144] found /home/jenkins/minikube-integration/22021-2487/.minikube/key.pem, removing ...
	I1202 19:07:20.991220   49088 exec_runner.go:203] rm: /home/jenkins/minikube-integration/22021-2487/.minikube/key.pem
	I1202 19:07:20.991247   49088 exec_runner.go:151] cp: /home/jenkins/minikube-integration/22021-2487/.minikube/certs/key.pem --> /home/jenkins/minikube-integration/22021-2487/.minikube/key.pem (1675 bytes)
	I1202 19:07:20.991297   49088 provision.go:117] generating server cert: /home/jenkins/minikube-integration/22021-2487/.minikube/machines/server.pem ca-key=/home/jenkins/minikube-integration/22021-2487/.minikube/certs/ca.pem private-key=/home/jenkins/minikube-integration/22021-2487/.minikube/certs/ca-key.pem org=jenkins.functional-449836 san=[127.0.0.1 192.168.49.2 functional-449836 localhost minikube]
	I1202 19:07:21.335552   49088 provision.go:177] copyRemoteCerts
	I1202 19:07:21.335618   49088 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I1202 19:07:21.335658   49088 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-449836
	I1202 19:07:21.354079   49088 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:32788 SSHKeyPath:/home/jenkins/minikube-integration/22021-2487/.minikube/machines/functional-449836/id_rsa Username:docker}
	I1202 19:07:21.460475   49088 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22021-2487/.minikube/machines/server.pem -> /etc/docker/server.pem
	I1202 19:07:21.460535   49088 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/machines/server.pem --> /etc/docker/server.pem (1220 bytes)
	I1202 19:07:21.478965   49088 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22021-2487/.minikube/machines/server-key.pem -> /etc/docker/server-key.pem
	I1202 19:07:21.479028   49088 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1675 bytes)
	I1202 19:07:21.497363   49088 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22021-2487/.minikube/certs/ca.pem -> /etc/docker/ca.pem
	I1202 19:07:21.497471   49088 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1082 bytes)
	I1202 19:07:21.514946   49088 provision.go:87] duration metric: took 553.36724ms to configureAuth
	I1202 19:07:21.515020   49088 ubuntu.go:206] setting minikube options for container-runtime
	I1202 19:07:21.515215   49088 config.go:182] Loaded profile config "functional-449836": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-beta.0
	I1202 19:07:21.515248   49088 machine.go:97] duration metric: took 1.070490831s to provisionDockerMachine
	I1202 19:07:21.515264   49088 start.go:293] postStartSetup for "functional-449836" (driver="docker")
	I1202 19:07:21.515276   49088 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I1202 19:07:21.515329   49088 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I1202 19:07:21.515382   49088 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-449836
	I1202 19:07:21.532644   49088 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:32788 SSHKeyPath:/home/jenkins/minikube-integration/22021-2487/.minikube/machines/functional-449836/id_rsa Username:docker}
	I1202 19:07:21.636416   49088 ssh_runner.go:195] Run: cat /etc/os-release
	I1202 19:07:21.639685   49088 command_runner.go:130] > PRETTY_NAME="Debian GNU/Linux 12 (bookworm)"
	I1202 19:07:21.639756   49088 command_runner.go:130] > NAME="Debian GNU/Linux"
	I1202 19:07:21.639777   49088 command_runner.go:130] > VERSION_ID="12"
	I1202 19:07:21.639798   49088 command_runner.go:130] > VERSION="12 (bookworm)"
	I1202 19:07:21.639827   49088 command_runner.go:130] > VERSION_CODENAME=bookworm
	I1202 19:07:21.639832   49088 command_runner.go:130] > ID=debian
	I1202 19:07:21.639847   49088 command_runner.go:130] > HOME_URL="https://www.debian.org/"
	I1202 19:07:21.639859   49088 command_runner.go:130] > SUPPORT_URL="https://www.debian.org/support"
	I1202 19:07:21.639866   49088 command_runner.go:130] > BUG_REPORT_URL="https://bugs.debian.org/"
	I1202 19:07:21.639943   49088 main.go:143] libmachine: Couldn't set key VERSION_CODENAME, no corresponding struct field found
	I1202 19:07:21.639962   49088 info.go:137] Remote host: Debian GNU/Linux 12 (bookworm)
	I1202 19:07:21.639974   49088 filesync.go:126] Scanning /home/jenkins/minikube-integration/22021-2487/.minikube/addons for local assets ...
	I1202 19:07:21.640036   49088 filesync.go:126] Scanning /home/jenkins/minikube-integration/22021-2487/.minikube/files for local assets ...
	I1202 19:07:21.640112   49088 filesync.go:149] local asset: /home/jenkins/minikube-integration/22021-2487/.minikube/files/etc/ssl/certs/44352.pem -> 44352.pem in /etc/ssl/certs
	I1202 19:07:21.640123   49088 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22021-2487/.minikube/files/etc/ssl/certs/44352.pem -> /etc/ssl/certs/44352.pem
	I1202 19:07:21.640204   49088 filesync.go:149] local asset: /home/jenkins/minikube-integration/22021-2487/.minikube/files/etc/test/nested/copy/4435/hosts -> hosts in /etc/test/nested/copy/4435
	I1202 19:07:21.640213   49088 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22021-2487/.minikube/files/etc/test/nested/copy/4435/hosts -> /etc/test/nested/copy/4435/hosts
	I1202 19:07:21.640263   49088 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs /etc/test/nested/copy/4435
	I1202 19:07:21.647807   49088 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/files/etc/ssl/certs/44352.pem --> /etc/ssl/certs/44352.pem (1708 bytes)
	I1202 19:07:21.664872   49088 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/files/etc/test/nested/copy/4435/hosts --> /etc/test/nested/copy/4435/hosts (40 bytes)
	I1202 19:07:21.686465   49088 start.go:296] duration metric: took 171.184702ms for postStartSetup
	I1202 19:07:21.686545   49088 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I1202 19:07:21.686646   49088 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-449836
	I1202 19:07:21.708068   49088 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:32788 SSHKeyPath:/home/jenkins/minikube-integration/22021-2487/.minikube/machines/functional-449836/id_rsa Username:docker}
	I1202 19:07:21.808826   49088 command_runner.go:130] > 18%
	I1202 19:07:21.809461   49088 ssh_runner.go:195] Run: sh -c "df -BG /var | awk 'NR==2{print $4}'"
	I1202 19:07:21.814183   49088 command_runner.go:130] > 159G
	I1202 19:07:21.814719   49088 fix.go:56] duration metric: took 1.390932828s for fixHost
	I1202 19:07:21.814741   49088 start.go:83] releasing machines lock for "functional-449836", held for 1.391011327s
	I1202 19:07:21.814809   49088 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" functional-449836
	I1202 19:07:21.831833   49088 ssh_runner.go:195] Run: cat /version.json
	I1202 19:07:21.831895   49088 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-449836
	I1202 19:07:21.832169   49088 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I1202 19:07:21.832229   49088 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-449836
	I1202 19:07:21.852617   49088 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:32788 SSHKeyPath:/home/jenkins/minikube-integration/22021-2487/.minikube/machines/functional-449836/id_rsa Username:docker}
	I1202 19:07:21.855772   49088 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:32788 SSHKeyPath:/home/jenkins/minikube-integration/22021-2487/.minikube/machines/functional-449836/id_rsa Username:docker}
	I1202 19:07:21.955939   49088 command_runner.go:130] > {"iso_version": "v1.37.0-1763503576-21924", "kicbase_version": "v0.0.48-1764169655-21974", "minikube_version": "v1.37.0", "commit": "5499406178e21d60d74d327c9716de794e8a4797"}
	I1202 19:07:21.956090   49088 ssh_runner.go:195] Run: systemctl --version
	I1202 19:07:22.048548   49088 command_runner.go:130] > <a href="https://github.com/kubernetes/registry.k8s.io">Temporary Redirect</a>.
	I1202 19:07:22.051368   49088 command_runner.go:130] > systemd 252 (252.39-1~deb12u1)
	I1202 19:07:22.051402   49088 command_runner.go:130] > +PAM +AUDIT +SELINUX +APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT +QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified
	I1202 19:07:22.051488   49088 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	I1202 19:07:22.055900   49088 command_runner.go:130] ! stat: cannot statx '/etc/cni/net.d/*loopback.conf*': No such file or directory
	W1202 19:07:22.056072   49088 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I1202 19:07:22.056144   49088 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%p, " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I1202 19:07:22.064483   49088 cni.go:259] no active bridge cni configs found in "/etc/cni/net.d" - nothing to disable
	I1202 19:07:22.064507   49088 start.go:496] detecting cgroup driver to use...
	I1202 19:07:22.064540   49088 detect.go:187] detected "cgroupfs" cgroup driver on host os
	I1202 19:07:22.064608   49088 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I1202 19:07:22.080944   49088 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I1202 19:07:22.094328   49088 docker.go:218] disabling cri-docker service (if available) ...
	I1202 19:07:22.094412   49088 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.socket
	I1202 19:07:22.110538   49088 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.service
	I1202 19:07:22.123916   49088 ssh_runner.go:195] Run: sudo systemctl disable cri-docker.socket
	I1202 19:07:22.251555   49088 ssh_runner.go:195] Run: sudo systemctl mask cri-docker.service
	I1202 19:07:22.372403   49088 docker.go:234] disabling docker service ...
	I1202 19:07:22.372547   49088 ssh_runner.go:195] Run: sudo systemctl stop -f docker.socket
	I1202 19:07:22.390362   49088 ssh_runner.go:195] Run: sudo systemctl stop -f docker.service
	I1202 19:07:22.404129   49088 ssh_runner.go:195] Run: sudo systemctl disable docker.socket
	I1202 19:07:22.527674   49088 ssh_runner.go:195] Run: sudo systemctl mask docker.service
	I1202 19:07:22.641245   49088 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service docker
	I1202 19:07:22.654510   49088 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I1202 19:07:22.669149   49088 command_runner.go:130] > runtime-endpoint: unix:///run/containerd/containerd.sock
	I1202 19:07:22.670616   49088 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.10.1"|' /etc/containerd/config.toml"
	I1202 19:07:22.680782   49088 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I1202 19:07:22.690619   49088 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I1202 19:07:22.690690   49088 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I1202 19:07:22.700650   49088 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I1202 19:07:22.710637   49088 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I1202 19:07:22.720237   49088 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I1202 19:07:22.730375   49088 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I1202 19:07:22.738458   49088 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I1202 19:07:22.747256   49088 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I1202 19:07:22.756269   49088 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
	I1202 19:07:22.765824   49088 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I1202 19:07:22.772632   49088 command_runner.go:130] > net.bridge.bridge-nf-call-iptables = 1
	I1202 19:07:22.773683   49088 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I1202 19:07:22.781384   49088 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1202 19:07:22.894036   49088 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I1202 19:07:22.996092   49088 start.go:543] Will wait 60s for socket path /run/containerd/containerd.sock
	I1202 19:07:22.996190   49088 ssh_runner.go:195] Run: stat /run/containerd/containerd.sock
	I1202 19:07:23.000049   49088 command_runner.go:130] >   File: /run/containerd/containerd.sock
	I1202 19:07:23.000075   49088 command_runner.go:130] >   Size: 0         	Blocks: 0          IO Block: 4096   socket
	I1202 19:07:23.000083   49088 command_runner.go:130] > Device: 0,72	Inode: 1611        Links: 1
	I1202 19:07:23.000090   49088 command_runner.go:130] > Access: (0660/srw-rw----)  Uid: (    0/    root)   Gid: (    0/    root)
	I1202 19:07:23.000119   49088 command_runner.go:130] > Access: 2025-12-02 19:07:22.966112143 +0000
	I1202 19:07:23.000134   49088 command_runner.go:130] > Modify: 2025-12-02 19:07:22.966112143 +0000
	I1202 19:07:23.000139   49088 command_runner.go:130] > Change: 2025-12-02 19:07:22.966112143 +0000
	I1202 19:07:23.000143   49088 command_runner.go:130] >  Birth: -
	I1202 19:07:23.000708   49088 start.go:564] Will wait 60s for crictl version
	I1202 19:07:23.000798   49088 ssh_runner.go:195] Run: which crictl
	I1202 19:07:23.004553   49088 command_runner.go:130] > /usr/local/bin/crictl
	I1202 19:07:23.004698   49088 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl version
	I1202 19:07:23.031006   49088 command_runner.go:130] > Version:  0.1.0
	I1202 19:07:23.031142   49088 command_runner.go:130] > RuntimeName:  containerd
	I1202 19:07:23.031156   49088 command_runner.go:130] > RuntimeVersion:  v2.1.5
	I1202 19:07:23.031165   49088 command_runner.go:130] > RuntimeApiVersion:  v1
	I1202 19:07:23.033497   49088 start.go:580] Version:  0.1.0
	RuntimeName:  containerd
	RuntimeVersion:  v2.1.5
	RuntimeApiVersion:  v1
	I1202 19:07:23.033588   49088 ssh_runner.go:195] Run: containerd --version
	I1202 19:07:23.053512   49088 command_runner.go:130] > containerd containerd.io v2.1.5 fcd43222d6b07379a4be9786bda52438f0dd16a1
	I1202 19:07:23.055064   49088 ssh_runner.go:195] Run: containerd --version
	I1202 19:07:23.073280   49088 command_runner.go:130] > containerd containerd.io v2.1.5 fcd43222d6b07379a4be9786bda52438f0dd16a1
	I1202 19:07:23.080684   49088 out.go:179] * Preparing Kubernetes v1.35.0-beta.0 on containerd 2.1.5 ...
	I1202 19:07:23.083736   49088 cli_runner.go:164] Run: docker network inspect functional-449836 --format "{"Name": "{{.Name}}","Driver": "{{.Driver}}","Subnet": "{{range .IPAM.Config}}{{.Subnet}}{{end}}","Gateway": "{{range .IPAM.Config}}{{.Gateway}}{{end}}","MTU": {{if (index .Options "com.docker.network.driver.mtu")}}{{(index .Options "com.docker.network.driver.mtu")}}{{else}}0{{end}}, "ContainerIPs": [{{range $k,$v := .Containers }}"{{$v.IPv4Address}}",{{end}}]}"
	I1202 19:07:23.100485   49088 ssh_runner.go:195] Run: grep 192.168.49.1	host.minikube.internal$ /etc/hosts
	I1202 19:07:23.104603   49088 command_runner.go:130] > 192.168.49.1	host.minikube.internal
	I1202 19:07:23.104709   49088 kubeadm.go:884] updating cluster {Name:functional-449836 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-449836 Namespace:default APIServerHAVIP: APIServerName:minikub
eCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:fal
se CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} ...
	I1202 19:07:23.104831   49088 preload.go:188] Checking if preload exists for k8s version v1.35.0-beta.0 and runtime containerd
	I1202 19:07:23.104890   49088 ssh_runner.go:195] Run: sudo crictl images --output json
	I1202 19:07:23.127690   49088 command_runner.go:130] > {
	I1202 19:07:23.127710   49088 command_runner.go:130] >   "images":  [
	I1202 19:07:23.127715   49088 command_runner.go:130] >     {
	I1202 19:07:23.127725   49088 command_runner.go:130] >       "id":  "sha256:66749159455b3f08c8318fe0233122f54d0f5889f9c5fdfb73c3fd9d99895b51",
	I1202 19:07:23.127729   49088 command_runner.go:130] >       "repoTags":  [
	I1202 19:07:23.127744   49088 command_runner.go:130] >         "gcr.io/k8s-minikube/storage-provisioner:v5"
	I1202 19:07:23.127750   49088 command_runner.go:130] >       ],
	I1202 19:07:23.127755   49088 command_runner.go:130] >       "repoDigests":  [],
	I1202 19:07:23.127759   49088 command_runner.go:130] >       "size":  "8032639",
	I1202 19:07:23.127765   49088 command_runner.go:130] >       "username":  "",
	I1202 19:07:23.127776   49088 command_runner.go:130] >       "pinned":  false
	I1202 19:07:23.127781   49088 command_runner.go:130] >     },
	I1202 19:07:23.127784   49088 command_runner.go:130] >     {
	I1202 19:07:23.127792   49088 command_runner.go:130] >       "id":  "sha256:e08f4d9d2e6ede8185064c13b41f8eeee95b609c0ca93b6fe7509fe527c907cf",
	I1202 19:07:23.127800   49088 command_runner.go:130] >       "repoTags":  [
	I1202 19:07:23.127806   49088 command_runner.go:130] >         "registry.k8s.io/coredns/coredns:v1.13.1"
	I1202 19:07:23.127813   49088 command_runner.go:130] >       ],
	I1202 19:07:23.127817   49088 command_runner.go:130] >       "repoDigests":  [],
	I1202 19:07:23.127822   49088 command_runner.go:130] >       "size":  "21166088",
	I1202 19:07:23.127826   49088 command_runner.go:130] >       "username":  "nonroot",
	I1202 19:07:23.127832   49088 command_runner.go:130] >       "pinned":  false
	I1202 19:07:23.127835   49088 command_runner.go:130] >     },
	I1202 19:07:23.127838   49088 command_runner.go:130] >     {
	I1202 19:07:23.127845   49088 command_runner.go:130] >       "id":  "sha256:2c5f0dedd21c25ec3a6709934d22152d53ec50fe57b72d29e4450655e3d14d42",
	I1202 19:07:23.127855   49088 command_runner.go:130] >       "repoTags":  [
	I1202 19:07:23.127869   49088 command_runner.go:130] >         "registry.k8s.io/etcd:3.6.5-0"
	I1202 19:07:23.127876   49088 command_runner.go:130] >       ],
	I1202 19:07:23.127880   49088 command_runner.go:130] >       "repoDigests":  [],
	I1202 19:07:23.127887   49088 command_runner.go:130] >       "size":  "21134420",
	I1202 19:07:23.127892   49088 command_runner.go:130] >       "uid":  {
	I1202 19:07:23.127899   49088 command_runner.go:130] >         "value":  "0"
	I1202 19:07:23.127903   49088 command_runner.go:130] >       },
	I1202 19:07:23.127907   49088 command_runner.go:130] >       "username":  "",
	I1202 19:07:23.127911   49088 command_runner.go:130] >       "pinned":  false
	I1202 19:07:23.127917   49088 command_runner.go:130] >     },
	I1202 19:07:23.127919   49088 command_runner.go:130] >     {
	I1202 19:07:23.127926   49088 command_runner.go:130] >       "id":  "sha256:ccd634d9bcc36ac6235e9c86676cd3a02c06afc3788a25f1bbf39ca7d44585f4",
	I1202 19:07:23.127930   49088 command_runner.go:130] >       "repoTags":  [
	I1202 19:07:23.127938   49088 command_runner.go:130] >         "registry.k8s.io/kube-apiserver:v1.35.0-beta.0"
	I1202 19:07:23.127945   49088 command_runner.go:130] >       ],
	I1202 19:07:23.127949   49088 command_runner.go:130] >       "repoDigests":  [],
	I1202 19:07:23.127953   49088 command_runner.go:130] >       "size":  "24676285",
	I1202 19:07:23.127961   49088 command_runner.go:130] >       "uid":  {
	I1202 19:07:23.127965   49088 command_runner.go:130] >         "value":  "0"
	I1202 19:07:23.127971   49088 command_runner.go:130] >       },
	I1202 19:07:23.127975   49088 command_runner.go:130] >       "username":  "",
	I1202 19:07:23.127983   49088 command_runner.go:130] >       "pinned":  false
	I1202 19:07:23.127987   49088 command_runner.go:130] >     },
	I1202 19:07:23.127996   49088 command_runner.go:130] >     {
	I1202 19:07:23.128002   49088 command_runner.go:130] >       "id":  "sha256:68b5f775f18769fcb77bd8474c80bda2050163b6c66f4551f352b7381b8ca5be",
	I1202 19:07:23.128006   49088 command_runner.go:130] >       "repoTags":  [
	I1202 19:07:23.128012   49088 command_runner.go:130] >         "registry.k8s.io/kube-controller-manager:v1.35.0-beta.0"
	I1202 19:07:23.128015   49088 command_runner.go:130] >       ],
	I1202 19:07:23.128019   49088 command_runner.go:130] >       "repoDigests":  [],
	I1202 19:07:23.128026   49088 command_runner.go:130] >       "size":  "20658969",
	I1202 19:07:23.128029   49088 command_runner.go:130] >       "uid":  {
	I1202 19:07:23.128033   49088 command_runner.go:130] >         "value":  "0"
	I1202 19:07:23.128041   49088 command_runner.go:130] >       },
	I1202 19:07:23.128052   49088 command_runner.go:130] >       "username":  "",
	I1202 19:07:23.128059   49088 command_runner.go:130] >       "pinned":  false
	I1202 19:07:23.128063   49088 command_runner.go:130] >     },
	I1202 19:07:23.128070   49088 command_runner.go:130] >     {
	I1202 19:07:23.128077   49088 command_runner.go:130] >       "id":  "sha256:404c2e12861777b763b8feaa316d36680fc68ad308a8d2f6e55f1bb981cdd904",
	I1202 19:07:23.128081   49088 command_runner.go:130] >       "repoTags":  [
	I1202 19:07:23.128088   49088 command_runner.go:130] >         "registry.k8s.io/kube-proxy:v1.35.0-beta.0"
	I1202 19:07:23.128092   49088 command_runner.go:130] >       ],
	I1202 19:07:23.128096   49088 command_runner.go:130] >       "repoDigests":  [],
	I1202 19:07:23.128099   49088 command_runner.go:130] >       "size":  "22428165",
	I1202 19:07:23.128103   49088 command_runner.go:130] >       "username":  "",
	I1202 19:07:23.128109   49088 command_runner.go:130] >       "pinned":  false
	I1202 19:07:23.128113   49088 command_runner.go:130] >     },
	I1202 19:07:23.128116   49088 command_runner.go:130] >     {
	I1202 19:07:23.128123   49088 command_runner.go:130] >       "id":  "sha256:16378741539f1be9c6e347d127537d379a6592587b09b4eb47964cb5c43a409b",
	I1202 19:07:23.128130   49088 command_runner.go:130] >       "repoTags":  [
	I1202 19:07:23.128135   49088 command_runner.go:130] >         "registry.k8s.io/kube-scheduler:v1.35.0-beta.0"
	I1202 19:07:23.128143   49088 command_runner.go:130] >       ],
	I1202 19:07:23.128152   49088 command_runner.go:130] >       "repoDigests":  [],
	I1202 19:07:23.128160   49088 command_runner.go:130] >       "size":  "15389290",
	I1202 19:07:23.128163   49088 command_runner.go:130] >       "uid":  {
	I1202 19:07:23.128167   49088 command_runner.go:130] >         "value":  "0"
	I1202 19:07:23.128170   49088 command_runner.go:130] >       },
	I1202 19:07:23.128175   49088 command_runner.go:130] >       "username":  "",
	I1202 19:07:23.128179   49088 command_runner.go:130] >       "pinned":  false
	I1202 19:07:23.128185   49088 command_runner.go:130] >     },
	I1202 19:07:23.128188   49088 command_runner.go:130] >     {
	I1202 19:07:23.128199   49088 command_runner.go:130] >       "id":  "sha256:d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd",
	I1202 19:07:23.128203   49088 command_runner.go:130] >       "repoTags":  [
	I1202 19:07:23.128212   49088 command_runner.go:130] >         "registry.k8s.io/pause:3.10.1"
	I1202 19:07:23.128215   49088 command_runner.go:130] >       ],
	I1202 19:07:23.128223   49088 command_runner.go:130] >       "repoDigests":  [],
	I1202 19:07:23.128227   49088 command_runner.go:130] >       "size":  "265458",
	I1202 19:07:23.128238   49088 command_runner.go:130] >       "uid":  {
	I1202 19:07:23.128243   49088 command_runner.go:130] >         "value":  "65535"
	I1202 19:07:23.128248   49088 command_runner.go:130] >       },
	I1202 19:07:23.128252   49088 command_runner.go:130] >       "username":  "",
	I1202 19:07:23.128256   49088 command_runner.go:130] >       "pinned":  true
	I1202 19:07:23.128259   49088 command_runner.go:130] >     }
	I1202 19:07:23.128262   49088 command_runner.go:130] >   ]
	I1202 19:07:23.128265   49088 command_runner.go:130] > }
	I1202 19:07:23.130379   49088 containerd.go:627] all images are preloaded for containerd runtime.
	I1202 19:07:23.130403   49088 cache_images.go:86] Images are preloaded, skipping loading
	I1202 19:07:23.130410   49088 kubeadm.go:935] updating node { 192.168.49.2 8441 v1.35.0-beta.0 containerd true true} ...
	I1202 19:07:23.130509   49088 kubeadm.go:947] kubelet [Unit]
	Wants=containerd.service
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.35.0-beta.0/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=functional-449836 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.168.49.2
	
	[Install]
	 config:
	{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-449836 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I1202 19:07:23.130576   49088 ssh_runner.go:195] Run: sudo crictl info
	I1202 19:07:23.152707   49088 command_runner.go:130] > {
	I1202 19:07:23.152731   49088 command_runner.go:130] >   "cniconfig": {
	I1202 19:07:23.152737   49088 command_runner.go:130] >     "Networks": [
	I1202 19:07:23.152741   49088 command_runner.go:130] >       {
	I1202 19:07:23.152746   49088 command_runner.go:130] >         "Config": {
	I1202 19:07:23.152752   49088 command_runner.go:130] >           "CNIVersion": "0.3.1",
	I1202 19:07:23.152758   49088 command_runner.go:130] >           "Name": "cni-loopback",
	I1202 19:07:23.152768   49088 command_runner.go:130] >           "Plugins": [
	I1202 19:07:23.152775   49088 command_runner.go:130] >             {
	I1202 19:07:23.152779   49088 command_runner.go:130] >               "Network": {
	I1202 19:07:23.152784   49088 command_runner.go:130] >                 "ipam": {},
	I1202 19:07:23.152789   49088 command_runner.go:130] >                 "type": "loopback"
	I1202 19:07:23.152798   49088 command_runner.go:130] >               },
	I1202 19:07:23.152803   49088 command_runner.go:130] >               "Source": "{\"type\":\"loopback\"}"
	I1202 19:07:23.152810   49088 command_runner.go:130] >             }
	I1202 19:07:23.152814   49088 command_runner.go:130] >           ],
	I1202 19:07:23.152828   49088 command_runner.go:130] >           "Source": "{\n\"cniVersion\": \"0.3.1\",\n\"name\": \"cni-loopback\",\n\"plugins\": [{\n  \"type\": \"loopback\"\n}]\n}"
	I1202 19:07:23.152835   49088 command_runner.go:130] >         },
	I1202 19:07:23.152840   49088 command_runner.go:130] >         "IFName": "lo"
	I1202 19:07:23.152847   49088 command_runner.go:130] >       }
	I1202 19:07:23.152850   49088 command_runner.go:130] >     ],
	I1202 19:07:23.152855   49088 command_runner.go:130] >     "PluginConfDir": "/etc/cni/net.d",
	I1202 19:07:23.152860   49088 command_runner.go:130] >     "PluginDirs": [
	I1202 19:07:23.152865   49088 command_runner.go:130] >       "/opt/cni/bin"
	I1202 19:07:23.152869   49088 command_runner.go:130] >     ],
	I1202 19:07:23.152873   49088 command_runner.go:130] >     "PluginMaxConfNum": 1,
	I1202 19:07:23.152879   49088 command_runner.go:130] >     "Prefix": "eth"
	I1202 19:07:23.152883   49088 command_runner.go:130] >   },
	I1202 19:07:23.152891   49088 command_runner.go:130] >   "config": {
	I1202 19:07:23.152894   49088 command_runner.go:130] >     "cdiSpecDirs": [
	I1202 19:07:23.152898   49088 command_runner.go:130] >       "/etc/cdi",
	I1202 19:07:23.152907   49088 command_runner.go:130] >       "/var/run/cdi"
	I1202 19:07:23.152910   49088 command_runner.go:130] >     ],
	I1202 19:07:23.152917   49088 command_runner.go:130] >     "cni": {
	I1202 19:07:23.152921   49088 command_runner.go:130] >       "binDir": "",
	I1202 19:07:23.152928   49088 command_runner.go:130] >       "binDirs": [
	I1202 19:07:23.152933   49088 command_runner.go:130] >         "/opt/cni/bin"
	I1202 19:07:23.152936   49088 command_runner.go:130] >       ],
	I1202 19:07:23.152941   49088 command_runner.go:130] >       "confDir": "/etc/cni/net.d",
	I1202 19:07:23.152947   49088 command_runner.go:130] >       "confTemplate": "",
	I1202 19:07:23.152954   49088 command_runner.go:130] >       "ipPref": "",
	I1202 19:07:23.152958   49088 command_runner.go:130] >       "maxConfNum": 1,
	I1202 19:07:23.152963   49088 command_runner.go:130] >       "setupSerially": false,
	I1202 19:07:23.152969   49088 command_runner.go:130] >       "useInternalLoopback": false
	I1202 19:07:23.152977   49088 command_runner.go:130] >     },
	I1202 19:07:23.152983   49088 command_runner.go:130] >     "containerd": {
	I1202 19:07:23.152992   49088 command_runner.go:130] >       "defaultRuntimeName": "runc",
	I1202 19:07:23.152997   49088 command_runner.go:130] >       "ignoreBlockIONotEnabledErrors": false,
	I1202 19:07:23.153006   49088 command_runner.go:130] >       "ignoreRdtNotEnabledErrors": false,
	I1202 19:07:23.153010   49088 command_runner.go:130] >       "runtimes": {
	I1202 19:07:23.153017   49088 command_runner.go:130] >         "runc": {
	I1202 19:07:23.153022   49088 command_runner.go:130] >           "ContainerAnnotations": null,
	I1202 19:07:23.153026   49088 command_runner.go:130] >           "PodAnnotations": null,
	I1202 19:07:23.153031   49088 command_runner.go:130] >           "baseRuntimeSpec": "",
	I1202 19:07:23.153035   49088 command_runner.go:130] >           "cgroupWritable": false,
	I1202 19:07:23.153041   49088 command_runner.go:130] >           "cniConfDir": "",
	I1202 19:07:23.153046   49088 command_runner.go:130] >           "cniMaxConfNum": 0,
	I1202 19:07:23.153053   49088 command_runner.go:130] >           "io_type": "",
	I1202 19:07:23.153058   49088 command_runner.go:130] >           "options": {
	I1202 19:07:23.153066   49088 command_runner.go:130] >             "BinaryName": "",
	I1202 19:07:23.153071   49088 command_runner.go:130] >             "CriuImagePath": "",
	I1202 19:07:23.153079   49088 command_runner.go:130] >             "CriuWorkPath": "",
	I1202 19:07:23.153083   49088 command_runner.go:130] >             "IoGid": 0,
	I1202 19:07:23.153091   49088 command_runner.go:130] >             "IoUid": 0,
	I1202 19:07:23.153096   49088 command_runner.go:130] >             "NoNewKeyring": false,
	I1202 19:07:23.153100   49088 command_runner.go:130] >             "Root": "",
	I1202 19:07:23.153104   49088 command_runner.go:130] >             "ShimCgroup": "",
	I1202 19:07:23.153111   49088 command_runner.go:130] >             "SystemdCgroup": false
	I1202 19:07:23.153115   49088 command_runner.go:130] >           },
	I1202 19:07:23.153120   49088 command_runner.go:130] >           "privileged_without_host_devices": false,
	I1202 19:07:23.153128   49088 command_runner.go:130] >           "privileged_without_host_devices_all_devices_allowed": false,
	I1202 19:07:23.153136   49088 command_runner.go:130] >           "runtimePath": "",
	I1202 19:07:23.153143   49088 command_runner.go:130] >           "runtimeType": "io.containerd.runc.v2",
	I1202 19:07:23.153237   49088 command_runner.go:130] >           "sandboxer": "podsandbox",
	I1202 19:07:23.153375   49088 command_runner.go:130] >           "snapshotter": ""
	I1202 19:07:23.153385   49088 command_runner.go:130] >         }
	I1202 19:07:23.153389   49088 command_runner.go:130] >       }
	I1202 19:07:23.153393   49088 command_runner.go:130] >     },
	I1202 19:07:23.153414   49088 command_runner.go:130] >     "containerdEndpoint": "/run/containerd/containerd.sock",
	I1202 19:07:23.153424   49088 command_runner.go:130] >     "containerdRootDir": "/var/lib/containerd",
	I1202 19:07:23.153435   49088 command_runner.go:130] >     "device_ownership_from_security_context": false,
	I1202 19:07:23.153444   49088 command_runner.go:130] >     "disableApparmor": false,
	I1202 19:07:23.153449   49088 command_runner.go:130] >     "disableHugetlbController": true,
	I1202 19:07:23.153457   49088 command_runner.go:130] >     "disableProcMount": false,
	I1202 19:07:23.153467   49088 command_runner.go:130] >     "drainExecSyncIOTimeout": "0s",
	I1202 19:07:23.153475   49088 command_runner.go:130] >     "enableCDI": true,
	I1202 19:07:23.153479   49088 command_runner.go:130] >     "enableSelinux": false,
	I1202 19:07:23.153484   49088 command_runner.go:130] >     "enableUnprivilegedICMP": true,
	I1202 19:07:23.153490   49088 command_runner.go:130] >     "enableUnprivilegedPorts": true,
	I1202 19:07:23.153500   49088 command_runner.go:130] >     "ignoreDeprecationWarnings": null,
	I1202 19:07:23.153508   49088 command_runner.go:130] >     "ignoreImageDefinedVolumes": false,
	I1202 19:07:23.153516   49088 command_runner.go:130] >     "maxContainerLogLineSize": 16384,
	I1202 19:07:23.153522   49088 command_runner.go:130] >     "netnsMountsUnderStateDir": false,
	I1202 19:07:23.153534   49088 command_runner.go:130] >     "restrictOOMScoreAdj": false,
	I1202 19:07:23.153544   49088 command_runner.go:130] >     "rootDir": "/var/lib/containerd/io.containerd.grpc.v1.cri",
	I1202 19:07:23.153549   49088 command_runner.go:130] >     "selinuxCategoryRange": 1024,
	I1202 19:07:23.153562   49088 command_runner.go:130] >     "stateDir": "/run/containerd/io.containerd.grpc.v1.cri",
	I1202 19:07:23.153570   49088 command_runner.go:130] >     "tolerateMissingHugetlbController": true,
	I1202 19:07:23.153575   49088 command_runner.go:130] >     "unsetSeccompProfile": ""
	I1202 19:07:23.153578   49088 command_runner.go:130] >   },
	I1202 19:07:23.153582   49088 command_runner.go:130] >   "features": {
	I1202 19:07:23.153588   49088 command_runner.go:130] >     "supplemental_groups_policy": true
	I1202 19:07:23.153597   49088 command_runner.go:130] >   },
	I1202 19:07:23.153605   49088 command_runner.go:130] >   "golang": "go1.24.9",
	I1202 19:07:23.153615   49088 command_runner.go:130] >   "lastCNILoadStatus": "cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config",
	I1202 19:07:23.153633   49088 command_runner.go:130] >   "lastCNILoadStatus.default": "cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config",
	I1202 19:07:23.153644   49088 command_runner.go:130] >   "runtimeHandlers": [
	I1202 19:07:23.153649   49088 command_runner.go:130] >     {
	I1202 19:07:23.153658   49088 command_runner.go:130] >       "features": {
	I1202 19:07:23.153664   49088 command_runner.go:130] >         "recursive_read_only_mounts": true,
	I1202 19:07:23.153669   49088 command_runner.go:130] >         "user_namespaces": true
	I1202 19:07:23.153675   49088 command_runner.go:130] >       }
	I1202 19:07:23.153679   49088 command_runner.go:130] >     },
	I1202 19:07:23.153686   49088 command_runner.go:130] >     {
	I1202 19:07:23.153691   49088 command_runner.go:130] >       "features": {
	I1202 19:07:23.153703   49088 command_runner.go:130] >         "recursive_read_only_mounts": true,
	I1202 19:07:23.153708   49088 command_runner.go:130] >         "user_namespaces": true
	I1202 19:07:23.153715   49088 command_runner.go:130] >       },
	I1202 19:07:23.153720   49088 command_runner.go:130] >       "name": "runc"
	I1202 19:07:23.153727   49088 command_runner.go:130] >     }
	I1202 19:07:23.153731   49088 command_runner.go:130] >   ],
	I1202 19:07:23.153738   49088 command_runner.go:130] >   "status": {
	I1202 19:07:23.153742   49088 command_runner.go:130] >     "conditions": [
	I1202 19:07:23.153746   49088 command_runner.go:130] >       {
	I1202 19:07:23.153751   49088 command_runner.go:130] >         "message": "",
	I1202 19:07:23.153757   49088 command_runner.go:130] >         "reason": "",
	I1202 19:07:23.153766   49088 command_runner.go:130] >         "status": true,
	I1202 19:07:23.153774   49088 command_runner.go:130] >         "type": "RuntimeReady"
	I1202 19:07:23.153781   49088 command_runner.go:130] >       },
	I1202 19:07:23.153785   49088 command_runner.go:130] >       {
	I1202 19:07:23.153792   49088 command_runner.go:130] >         "message": "Network plugin returns error: cni plugin not initialized",
	I1202 19:07:23.153797   49088 command_runner.go:130] >         "reason": "NetworkPluginNotReady",
	I1202 19:07:23.153805   49088 command_runner.go:130] >         "status": false,
	I1202 19:07:23.153810   49088 command_runner.go:130] >         "type": "NetworkReady"
	I1202 19:07:23.153814   49088 command_runner.go:130] >       },
	I1202 19:07:23.153820   49088 command_runner.go:130] >       {
	I1202 19:07:23.153824   49088 command_runner.go:130] >         "message": "",
	I1202 19:07:23.153828   49088 command_runner.go:130] >         "reason": "",
	I1202 19:07:23.153836   49088 command_runner.go:130] >         "status": true,
	I1202 19:07:23.153850   49088 command_runner.go:130] >         "type": "ContainerdHasNoDeprecationWarnings"
	I1202 19:07:23.153857   49088 command_runner.go:130] >       }
	I1202 19:07:23.153861   49088 command_runner.go:130] >     ]
	I1202 19:07:23.153868   49088 command_runner.go:130] >   }
	I1202 19:07:23.153871   49088 command_runner.go:130] > }
	I1202 19:07:23.157283   49088 cni.go:84] Creating CNI manager for ""
	I1202 19:07:23.157307   49088 cni.go:143] "docker" driver + "containerd" runtime found, recommending kindnet
	I1202 19:07:23.157324   49088 kubeadm.go:85] Using pod CIDR: 10.244.0.0/16
	I1202 19:07:23.157352   49088 kubeadm.go:190] kubeadm options: {CertDir:/var/lib/minikube/certs ServiceCIDR:10.96.0.0/12 PodSubnet:10.244.0.0/16 AdvertiseAddress:192.168.49.2 APIServerPort:8441 KubernetesVersion:v1.35.0-beta.0 EtcdDataDir:/var/lib/minikube/etcd EtcdExtraArgs:map[] ClusterName:functional-449836 NodeName:functional-449836 DNSDomain:cluster.local CRISocket:/run/containerd/containerd.sock ImageRepository: ComponentOptions:[{Component:apiServer ExtraArgs:map[enable-admission-plugins:NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota] Pairs:map[certSANs:["127.0.0.1", "localhost", "192.168.49.2"]]} {Component:controllerManager ExtraArgs:map[allocate-node-cidrs:true leader-elect:false] Pairs:map[]} {Component:scheduler ExtraArgs:map[leader-elect:false] Pairs:map[]}] FeatureArgs:map[] NodeIP:192.168.49.2 CgroupDriver:cgroupfs ClientCAFile:/var/lib/minikube/certs/ca.crt Sta
ticPodPath:/etc/kubernetes/manifests ControlPlaneAddress:control-plane.minikube.internal KubeProxyOptions:map[] ResolvConfSearchRegression:false KubeletConfigOpts:map[containerRuntimeEndpoint:unix:///run/containerd/containerd.sock hairpinMode:hairpin-veth runtimeRequestTimeout:15m] PrependCriSocketUnix:true}
	I1202 19:07:23.157503   49088 kubeadm.go:196] kubeadm config:
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: InitConfiguration
	localAPIEndpoint:
	  advertiseAddress: 192.168.49.2
	  bindPort: 8441
	bootstrapTokens:
	  - groups:
	      - system:bootstrappers:kubeadm:default-node-token
	    ttl: 24h0m0s
	    usages:
	      - signing
	      - authentication
	nodeRegistration:
	  criSocket: unix:///run/containerd/containerd.sock
	  name: "functional-449836"
	  kubeletExtraArgs:
	    - name: "node-ip"
	      value: "192.168.49.2"
	  taints: []
	---
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: ClusterConfiguration
	apiServer:
	  certSANs: ["127.0.0.1", "localhost", "192.168.49.2"]
	  extraArgs:
	    - name: "enable-admission-plugins"
	      value: "NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota"
	controllerManager:
	  extraArgs:
	    - name: "allocate-node-cidrs"
	      value: "true"
	    - name: "leader-elect"
	      value: "false"
	scheduler:
	  extraArgs:
	    - name: "leader-elect"
	      value: "false"
	certificatesDir: /var/lib/minikube/certs
	clusterName: mk
	controlPlaneEndpoint: control-plane.minikube.internal:8441
	etcd:
	  local:
	    dataDir: /var/lib/minikube/etcd
	kubernetesVersion: v1.35.0-beta.0
	networking:
	  dnsDomain: cluster.local
	  podSubnet: "10.244.0.0/16"
	  serviceSubnet: 10.96.0.0/12
	---
	apiVersion: kubelet.config.k8s.io/v1beta1
	kind: KubeletConfiguration
	authentication:
	  x509:
	    clientCAFile: /var/lib/minikube/certs/ca.crt
	cgroupDriver: cgroupfs
	containerRuntimeEndpoint: unix:///run/containerd/containerd.sock
	hairpinMode: hairpin-veth
	runtimeRequestTimeout: 15m
	clusterDomain: "cluster.local"
	# disable disk resource management by default
	imageGCHighThresholdPercent: 100
	evictionHard:
	  nodefs.available: "0%"
	  nodefs.inodesFree: "0%"
	  imagefs.available: "0%"
	failSwapOn: false
	staticPodPath: /etc/kubernetes/manifests
	---
	apiVersion: kubeproxy.config.k8s.io/v1alpha1
	kind: KubeProxyConfiguration
	clusterCIDR: "10.244.0.0/16"
	metricsBindAddress: 0.0.0.0:10249
	conntrack:
	  maxPerCore: 0
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_established"
	  tcpEstablishedTimeout: 0s
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_close"
	  tcpCloseWaitTimeout: 0s
	
	I1202 19:07:23.157589   49088 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.35.0-beta.0
	I1202 19:07:23.165274   49088 command_runner.go:130] > kubeadm
	I1202 19:07:23.165296   49088 command_runner.go:130] > kubectl
	I1202 19:07:23.165301   49088 command_runner.go:130] > kubelet
	I1202 19:07:23.166244   49088 binaries.go:51] Found k8s binaries, skipping transfer
	I1202 19:07:23.166309   49088 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /var/tmp/minikube
	I1202 19:07:23.176520   49088 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (328 bytes)
	I1202 19:07:23.191534   49088 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (359 bytes)
	I1202 19:07:23.207596   49088 ssh_runner.go:362] scp memory --> /var/tmp/minikube/kubeadm.yaml.new (2237 bytes)
	I1202 19:07:23.221899   49088 ssh_runner.go:195] Run: grep 192.168.49.2	control-plane.minikube.internal$ /etc/hosts
	I1202 19:07:23.225538   49088 command_runner.go:130] > 192.168.49.2	control-plane.minikube.internal
	I1202 19:07:23.225972   49088 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1202 19:07:23.344071   49088 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I1202 19:07:24.171449   49088 certs.go:69] Setting up /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836 for IP: 192.168.49.2
	I1202 19:07:24.171473   49088 certs.go:195] generating shared ca certs ...
	I1202 19:07:24.171491   49088 certs.go:227] acquiring lock for ca certs: {Name:mk2ce7651a779b9fbf8eac798f9ac184328de0c6 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1202 19:07:24.171633   49088 certs.go:236] skipping valid "minikubeCA" ca cert: /home/jenkins/minikube-integration/22021-2487/.minikube/ca.key
	I1202 19:07:24.171683   49088 certs.go:236] skipping valid "proxyClientCA" ca cert: /home/jenkins/minikube-integration/22021-2487/.minikube/proxy-client-ca.key
	I1202 19:07:24.171697   49088 certs.go:257] generating profile certs ...
	I1202 19:07:24.171794   49088 certs.go:360] skipping valid signed profile cert regeneration for "minikube-user": /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/client.key
	I1202 19:07:24.171860   49088 certs.go:360] skipping valid signed profile cert regeneration for "minikube": /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/apiserver.key.a65b71da
	I1202 19:07:24.171905   49088 certs.go:360] skipping valid signed profile cert regeneration for "aggregator": /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/proxy-client.key
	I1202 19:07:24.171916   49088 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22021-2487/.minikube/ca.crt -> /var/lib/minikube/certs/ca.crt
	I1202 19:07:24.171929   49088 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22021-2487/.minikube/ca.key -> /var/lib/minikube/certs/ca.key
	I1202 19:07:24.171946   49088 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22021-2487/.minikube/proxy-client-ca.crt -> /var/lib/minikube/certs/proxy-client-ca.crt
	I1202 19:07:24.171957   49088 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22021-2487/.minikube/proxy-client-ca.key -> /var/lib/minikube/certs/proxy-client-ca.key
	I1202 19:07:24.171972   49088 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/apiserver.crt -> /var/lib/minikube/certs/apiserver.crt
	I1202 19:07:24.171985   49088 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/apiserver.key -> /var/lib/minikube/certs/apiserver.key
	I1202 19:07:24.172001   49088 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/proxy-client.crt -> /var/lib/minikube/certs/proxy-client.crt
	I1202 19:07:24.172012   49088 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/proxy-client.key -> /var/lib/minikube/certs/proxy-client.key
	I1202 19:07:24.172062   49088 certs.go:484] found cert: /home/jenkins/minikube-integration/22021-2487/.minikube/certs/4435.pem (1338 bytes)
	W1202 19:07:24.172113   49088 certs.go:480] ignoring /home/jenkins/minikube-integration/22021-2487/.minikube/certs/4435_empty.pem, impossibly tiny 0 bytes
	I1202 19:07:24.172126   49088 certs.go:484] found cert: /home/jenkins/minikube-integration/22021-2487/.minikube/certs/ca-key.pem (1679 bytes)
	I1202 19:07:24.172154   49088 certs.go:484] found cert: /home/jenkins/minikube-integration/22021-2487/.minikube/certs/ca.pem (1082 bytes)
	I1202 19:07:24.172189   49088 certs.go:484] found cert: /home/jenkins/minikube-integration/22021-2487/.minikube/certs/cert.pem (1123 bytes)
	I1202 19:07:24.172215   49088 certs.go:484] found cert: /home/jenkins/minikube-integration/22021-2487/.minikube/certs/key.pem (1675 bytes)
	I1202 19:07:24.172266   49088 certs.go:484] found cert: /home/jenkins/minikube-integration/22021-2487/.minikube/files/etc/ssl/certs/44352.pem (1708 bytes)
	I1202 19:07:24.172298   49088 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22021-2487/.minikube/files/etc/ssl/certs/44352.pem -> /usr/share/ca-certificates/44352.pem
	I1202 19:07:24.172314   49088 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22021-2487/.minikube/ca.crt -> /usr/share/ca-certificates/minikubeCA.pem
	I1202 19:07:24.172347   49088 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22021-2487/.minikube/certs/4435.pem -> /usr/share/ca-certificates/4435.pem
	I1202 19:07:24.172878   49088 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I1202 19:07:24.192840   49088 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1675 bytes)
	I1202 19:07:24.210709   49088 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I1202 19:07:24.228270   49088 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1675 bytes)
	I1202 19:07:24.246519   49088 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1424 bytes)
	I1202 19:07:24.264649   49088 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1675 bytes)
	I1202 19:07:24.283289   49088 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I1202 19:07:24.302316   49088 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1675 bytes)
	I1202 19:07:24.320907   49088 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/files/etc/ssl/certs/44352.pem --> /usr/share/ca-certificates/44352.pem (1708 bytes)
	I1202 19:07:24.338895   49088 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I1202 19:07:24.356995   49088 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/certs/4435.pem --> /usr/share/ca-certificates/4435.pem (1338 bytes)
	I1202 19:07:24.374784   49088 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (738 bytes)
	I1202 19:07:24.388173   49088 ssh_runner.go:195] Run: openssl version
	I1202 19:07:24.394457   49088 command_runner.go:130] > OpenSSL 3.0.17 1 Jul 2025 (Library: OpenSSL 3.0.17 1 Jul 2025)
	I1202 19:07:24.394840   49088 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/44352.pem && ln -fs /usr/share/ca-certificates/44352.pem /etc/ssl/certs/44352.pem"
	I1202 19:07:24.403512   49088 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/44352.pem
	I1202 19:07:24.407229   49088 command_runner.go:130] > -rw-r--r-- 1 root root 1708 Dec  2 18:58 /usr/share/ca-certificates/44352.pem
	I1202 19:07:24.407385   49088 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Dec  2 18:58 /usr/share/ca-certificates/44352.pem
	I1202 19:07:24.407455   49088 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/44352.pem
	I1202 19:07:24.448501   49088 command_runner.go:130] > 3ec20f2e
	I1202 19:07:24.448942   49088 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/3ec20f2e.0 || ln -fs /etc/ssl/certs/44352.pem /etc/ssl/certs/3ec20f2e.0"
	I1202 19:07:24.456981   49088 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I1202 19:07:24.465478   49088 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I1202 19:07:24.469306   49088 command_runner.go:130] > -rw-r--r-- 1 root root 1111 Dec  2 18:48 /usr/share/ca-certificates/minikubeCA.pem
	I1202 19:07:24.469374   49088 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Dec  2 18:48 /usr/share/ca-certificates/minikubeCA.pem
	I1202 19:07:24.469438   49088 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I1202 19:07:24.510270   49088 command_runner.go:130] > b5213941
	I1202 19:07:24.510784   49088 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
	I1202 19:07:24.518790   49088 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/4435.pem && ln -fs /usr/share/ca-certificates/4435.pem /etc/ssl/certs/4435.pem"
	I1202 19:07:24.527001   49088 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/4435.pem
	I1202 19:07:24.530919   49088 command_runner.go:130] > -rw-r--r-- 1 root root 1338 Dec  2 18:58 /usr/share/ca-certificates/4435.pem
	I1202 19:07:24.530959   49088 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Dec  2 18:58 /usr/share/ca-certificates/4435.pem
	I1202 19:07:24.531008   49088 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/4435.pem
	I1202 19:07:24.571727   49088 command_runner.go:130] > 51391683
	I1202 19:07:24.572161   49088 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/51391683.0 || ln -fs /etc/ssl/certs/4435.pem /etc/ssl/certs/51391683.0"
	I1202 19:07:24.580157   49088 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I1202 19:07:24.584062   49088 command_runner.go:130] >   File: /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I1202 19:07:24.584087   49088 command_runner.go:130] >   Size: 1176      	Blocks: 8          IO Block: 4096   regular file
	I1202 19:07:24.584094   49088 command_runner.go:130] > Device: 259,1	Inode: 848916      Links: 1
	I1202 19:07:24.584101   49088 command_runner.go:130] > Access: (0644/-rw-r--r--)  Uid: (    0/    root)   Gid: (    0/    root)
	I1202 19:07:24.584108   49088 command_runner.go:130] > Access: 2025-12-02 19:03:16.577964732 +0000
	I1202 19:07:24.584114   49088 command_runner.go:130] > Modify: 2025-12-02 18:59:11.965818380 +0000
	I1202 19:07:24.584119   49088 command_runner.go:130] > Change: 2025-12-02 18:59:11.965818380 +0000
	I1202 19:07:24.584125   49088 command_runner.go:130] >  Birth: 2025-12-02 18:59:11.965818380 +0000
	I1202 19:07:24.584207   49088 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-etcd-client.crt -checkend 86400
	I1202 19:07:24.630311   49088 command_runner.go:130] > Certificate will not expire
	I1202 19:07:24.630810   49088 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-kubelet-client.crt -checkend 86400
	I1202 19:07:24.671995   49088 command_runner.go:130] > Certificate will not expire
	I1202 19:07:24.672412   49088 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/server.crt -checkend 86400
	I1202 19:07:24.713648   49088 command_runner.go:130] > Certificate will not expire
	I1202 19:07:24.713758   49088 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/healthcheck-client.crt -checkend 86400
	I1202 19:07:24.754977   49088 command_runner.go:130] > Certificate will not expire
	I1202 19:07:24.755077   49088 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/peer.crt -checkend 86400
	I1202 19:07:24.800995   49088 command_runner.go:130] > Certificate will not expire
	I1202 19:07:24.801486   49088 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/front-proxy-client.crt -checkend 86400
	I1202 19:07:24.844718   49088 command_runner.go:130] > Certificate will not expire
	I1202 19:07:24.845325   49088 kubeadm.go:401] StartCluster: {Name:functional-449836 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-449836 Namespace:default APIServerHAVIP: APIServerName:minikubeCA
APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false
CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1202 19:07:24.845410   49088 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:paused Name: Namespaces:[kube-system]}
	I1202 19:07:24.845499   49088 ssh_runner.go:195] Run: sudo -s eval "crictl ps -a --quiet --label io.kubernetes.pod.namespace=kube-system"
	I1202 19:07:24.875465   49088 cri.go:89] found id: ""
	I1202 19:07:24.875565   49088 ssh_runner.go:195] Run: sudo ls /var/lib/kubelet/kubeadm-flags.env /var/lib/kubelet/config.yaml /var/lib/minikube/etcd
	I1202 19:07:24.882887   49088 command_runner.go:130] > /var/lib/kubelet/config.yaml
	I1202 19:07:24.882908   49088 command_runner.go:130] > /var/lib/kubelet/kubeadm-flags.env
	I1202 19:07:24.882928   49088 command_runner.go:130] > /var/lib/minikube/etcd:
	I1202 19:07:24.883961   49088 kubeadm.go:417] found existing configuration files, will attempt cluster restart
	I1202 19:07:24.884012   49088 kubeadm.go:598] restartPrimaryControlPlane start ...
	I1202 19:07:24.884084   49088 ssh_runner.go:195] Run: sudo test -d /data/minikube
	I1202 19:07:24.891632   49088 kubeadm.go:131] /data/minikube skipping compat symlinks: sudo test -d /data/minikube: Process exited with status 1
	stdout:
	
	stderr:
	I1202 19:07:24.892026   49088 kubeconfig.go:47] verify endpoint returned: get endpoint: "functional-449836" does not appear in /home/jenkins/minikube-integration/22021-2487/kubeconfig
	I1202 19:07:24.892129   49088 kubeconfig.go:62] /home/jenkins/minikube-integration/22021-2487/kubeconfig needs updating (will repair): [kubeconfig missing "functional-449836" cluster setting kubeconfig missing "functional-449836" context setting]
	I1202 19:07:24.892546   49088 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/22021-2487/kubeconfig: {Name:mka13b3f16f6b2d645ade32cb83bfcf203300413 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1202 19:07:24.892988   49088 loader.go:402] Config loaded from file:  /home/jenkins/minikube-integration/22021-2487/kubeconfig
	I1202 19:07:24.893140   49088 kapi.go:59] client config for functional-449836: &rest.Config{Host:"https://192.168.49.2:8441", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/client.crt", KeyFile:"/home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/client.key", CAFile:"/home/jenkins/minikube-integration/22021-2487/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextP
rotos:[]string(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x1fb33c0), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), WarningHandlerWithContext:rest.WarningHandlerWithContext(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	I1202 19:07:24.893652   49088 envvar.go:172] "Feature gate default state" feature="WatchListClient" enabled=false
	I1202 19:07:24.893721   49088 cert_rotation.go:141] "Starting client certificate rotation controller" logger="tls-transport-cache"
	I1202 19:07:24.893742   49088 envvar.go:172] "Feature gate default state" feature="ClientsAllowCBOR" enabled=false
	I1202 19:07:24.893817   49088 envvar.go:172] "Feature gate default state" feature="ClientsPreferCBOR" enabled=false
	I1202 19:07:24.893840   49088 envvar.go:172] "Feature gate default state" feature="InformerResourceVersion" enabled=false
	I1202 19:07:24.893879   49088 envvar.go:172] "Feature gate default state" feature="InOrderInformers" enabled=true
	I1202 19:07:24.894204   49088 ssh_runner.go:195] Run: sudo diff -u /var/tmp/minikube/kubeadm.yaml /var/tmp/minikube/kubeadm.yaml.new
	I1202 19:07:24.902267   49088 kubeadm.go:635] The running cluster does not require reconfiguration: 192.168.49.2
	I1202 19:07:24.902298   49088 kubeadm.go:602] duration metric: took 18.265587ms to restartPrimaryControlPlane
	I1202 19:07:24.902309   49088 kubeadm.go:403] duration metric: took 56.993765ms to StartCluster
	I1202 19:07:24.902355   49088 settings.go:142] acquiring lock: {Name:mka76ea0dcf16fdbb68808885f8360c0083029b4 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1202 19:07:24.902437   49088 settings.go:150] Updating kubeconfig:  /home/jenkins/minikube-integration/22021-2487/kubeconfig
	I1202 19:07:24.903036   49088 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/22021-2487/kubeconfig: {Name:mka13b3f16f6b2d645ade32cb83bfcf203300413 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1202 19:07:24.903251   49088 start.go:236] Will wait 6m0s for node &{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}
	I1202 19:07:24.903573   49088 config.go:182] Loaded profile config "functional-449836": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-beta.0
	I1202 19:07:24.903617   49088 addons.go:527] enable addons start: toEnable=map[ambassador:false amd-gpu-device-plugin:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:true efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubetail:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:true storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false]
	I1202 19:07:24.903676   49088 addons.go:70] Setting storage-provisioner=true in profile "functional-449836"
	I1202 19:07:24.903691   49088 addons.go:239] Setting addon storage-provisioner=true in "functional-449836"
	I1202 19:07:24.903717   49088 host.go:66] Checking if "functional-449836" exists ...
	I1202 19:07:24.903830   49088 addons.go:70] Setting default-storageclass=true in profile "functional-449836"
	I1202 19:07:24.903877   49088 addons_storage_classes.go:34] enableOrDisableStorageClasses default-storageclass=true on "functional-449836"
	I1202 19:07:24.904207   49088 cli_runner.go:164] Run: docker container inspect functional-449836 --format={{.State.Status}}
	I1202 19:07:24.904250   49088 cli_runner.go:164] Run: docker container inspect functional-449836 --format={{.State.Status}}
	I1202 19:07:24.909664   49088 out.go:179] * Verifying Kubernetes components...
	I1202 19:07:24.912752   49088 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1202 19:07:24.942660   49088 out.go:179]   - Using image gcr.io/k8s-minikube/storage-provisioner:v5
	I1202 19:07:24.943205   49088 loader.go:402] Config loaded from file:  /home/jenkins/minikube-integration/22021-2487/kubeconfig
	I1202 19:07:24.943381   49088 kapi.go:59] client config for functional-449836: &rest.Config{Host:"https://192.168.49.2:8441", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/client.crt", KeyFile:"/home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/client.key", CAFile:"/home/jenkins/minikube-integration/22021-2487/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextP
rotos:[]string(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x1fb33c0), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), WarningHandlerWithContext:rest.WarningHandlerWithContext(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	I1202 19:07:24.943666   49088 addons.go:239] Setting addon default-storageclass=true in "functional-449836"
	I1202 19:07:24.943695   49088 host.go:66] Checking if "functional-449836" exists ...
	I1202 19:07:24.944105   49088 cli_runner.go:164] Run: docker container inspect functional-449836 --format={{.State.Status}}
	I1202 19:07:24.945588   49088 addons.go:436] installing /etc/kubernetes/addons/storage-provisioner.yaml
	I1202 19:07:24.945617   49088 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/storage-provisioner.yaml (2676 bytes)
	I1202 19:07:24.945676   49088 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-449836
	I1202 19:07:24.976744   49088 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:32788 SSHKeyPath:/home/jenkins/minikube-integration/22021-2487/.minikube/machines/functional-449836/id_rsa Username:docker}
	I1202 19:07:24.983018   49088 addons.go:436] installing /etc/kubernetes/addons/storageclass.yaml
	I1202 19:07:24.983040   49088 ssh_runner.go:362] scp storageclass/storageclass.yaml --> /etc/kubernetes/addons/storageclass.yaml (271 bytes)
	I1202 19:07:24.983109   49088 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-449836
	I1202 19:07:25.013238   49088 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:32788 SSHKeyPath:/home/jenkins/minikube-integration/22021-2487/.minikube/machines/functional-449836/id_rsa Username:docker}
	I1202 19:07:25.139303   49088 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I1202 19:07:25.147308   49088 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1202 19:07:25.166870   49088 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply -f /etc/kubernetes/addons/storageclass.yaml
	I1202 19:07:25.922715   49088 node_ready.go:35] waiting up to 6m0s for node "functional-449836" to be "Ready" ...
	I1202 19:07:25.922842   49088 type.go:168] "Request Body" body=""
	I1202 19:07:25.922904   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:25.923137   49088 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1202 19:07:25.923161   49088 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:25.923181   49088 retry.go:31] will retry after 314.802872ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:25.923212   49088 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1202 19:07:25.923227   49088 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:25.923235   49088 retry.go:31] will retry after 316.161686ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:25.923297   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:26.238968   49088 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1202 19:07:26.239458   49088 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1202 19:07:26.312262   49088 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1202 19:07:26.312301   49088 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:26.312346   49088 retry.go:31] will retry after 358.686092ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:26.320393   49088 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1202 19:07:26.320484   49088 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:26.320525   49088 retry.go:31] will retry after 528.121505ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:26.423804   49088 type.go:168] "Request Body" body=""
	I1202 19:07:26.423895   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:26.424214   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:26.671815   49088 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1202 19:07:26.745439   49088 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1202 19:07:26.745497   49088 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:26.745515   49088 retry.go:31] will retry after 446.477413ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:26.849789   49088 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1202 19:07:26.909069   49088 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1202 19:07:26.909108   49088 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:26.909134   49088 retry.go:31] will retry after 684.877567ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:26.923341   49088 type.go:168] "Request Body" body=""
	I1202 19:07:26.923433   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:26.923791   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:27.192236   49088 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1202 19:07:27.247207   49088 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1202 19:07:27.250502   49088 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:27.250546   49088 retry.go:31] will retry after 797.707708ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:27.423790   49088 type.go:168] "Request Body" body=""
	I1202 19:07:27.423889   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:27.424196   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:27.594774   49088 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1202 19:07:27.660877   49088 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1202 19:07:27.660957   49088 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:27.660987   49088 retry.go:31] will retry after 601.48037ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:27.923401   49088 type.go:168] "Request Body" body=""
	I1202 19:07:27.923475   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:27.923784   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:07:27.923848   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:07:28.049160   49088 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1202 19:07:28.112455   49088 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1202 19:07:28.112493   49088 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:28.112512   49088 retry.go:31] will retry after 941.564206ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:28.262919   49088 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1202 19:07:28.323250   49088 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1202 19:07:28.323307   49088 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:28.323325   49088 retry.go:31] will retry after 741.834409ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:28.423555   49088 type.go:168] "Request Body" body=""
	I1202 19:07:28.423652   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:28.423971   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:28.923658   49088 type.go:168] "Request Body" body=""
	I1202 19:07:28.923731   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:28.923989   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:29.054311   49088 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1202 19:07:29.065740   49088 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1202 19:07:29.126744   49088 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1202 19:07:29.126791   49088 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:29.126812   49088 retry.go:31] will retry after 2.378740888s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:29.143543   49088 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1202 19:07:29.143609   49088 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:29.143631   49088 retry.go:31] will retry after 2.739062704s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:29.422943   49088 type.go:168] "Request Body" body=""
	I1202 19:07:29.423043   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:29.423398   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:29.923116   49088 type.go:168] "Request Body" body=""
	I1202 19:07:29.923203   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:29.923554   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:30.422925   49088 type.go:168] "Request Body" body=""
	I1202 19:07:30.423004   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:30.423294   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:07:30.423351   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:07:30.922969   49088 type.go:168] "Request Body" body=""
	I1202 19:07:30.923047   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:30.923375   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:31.422975   49088 type.go:168] "Request Body" body=""
	I1202 19:07:31.423051   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:31.423376   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:31.506668   49088 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1202 19:07:31.565098   49088 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1202 19:07:31.565149   49088 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:31.565168   49088 retry.go:31] will retry after 3.30231188s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:31.883619   49088 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1202 19:07:31.923118   49088 type.go:168] "Request Body" body=""
	I1202 19:07:31.923190   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:31.923483   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:31.949881   49088 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1202 19:07:31.953682   49088 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:31.953716   49088 retry.go:31] will retry after 2.323480137s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:32.422997   49088 type.go:168] "Request Body" body=""
	I1202 19:07:32.423069   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:32.423382   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:07:32.423441   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:07:32.923115   49088 type.go:168] "Request Body" body=""
	I1202 19:07:32.923193   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:32.923525   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:33.422891   49088 type.go:168] "Request Body" body=""
	I1202 19:07:33.422956   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:33.423209   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:33.922911   49088 type.go:168] "Request Body" body=""
	I1202 19:07:33.922985   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:33.923302   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:34.277557   49088 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1202 19:07:34.337253   49088 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1202 19:07:34.337306   49088 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:34.337326   49088 retry.go:31] will retry after 5.941517157s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:34.423656   49088 type.go:168] "Request Body" body=""
	I1202 19:07:34.423738   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:34.424084   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:07:34.424136   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:07:34.867735   49088 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1202 19:07:34.923406   49088 type.go:168] "Request Body" body=""
	I1202 19:07:34.923506   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:34.923762   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:34.931582   49088 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1202 19:07:34.931622   49088 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:34.931641   49088 retry.go:31] will retry after 5.732328972s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:35.423000   49088 type.go:168] "Request Body" body=""
	I1202 19:07:35.423076   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:35.423385   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:35.922985   49088 type.go:168] "Request Body" body=""
	I1202 19:07:35.923063   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:35.923444   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:36.422994   49088 type.go:168] "Request Body" body=""
	I1202 19:07:36.423069   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:36.423390   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:36.922999   49088 type.go:168] "Request Body" body=""
	I1202 19:07:36.923077   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:36.923397   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:07:36.923453   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:07:37.423120   49088 type.go:168] "Request Body" body=""
	I1202 19:07:37.423196   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:37.423525   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:37.923076   49088 type.go:168] "Request Body" body=""
	I1202 19:07:37.923148   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:37.923450   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:38.423008   49088 type.go:168] "Request Body" body=""
	I1202 19:07:38.423082   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:38.423432   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:38.923000   49088 type.go:168] "Request Body" body=""
	I1202 19:07:38.923074   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:38.923410   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:39.423757   49088 type.go:168] "Request Body" body=""
	I1202 19:07:39.423827   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:39.424076   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:07:39.424115   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:07:39.923939   49088 type.go:168] "Request Body" body=""
	I1202 19:07:39.924013   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:39.924357   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:40.279081   49088 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1202 19:07:40.340610   49088 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1202 19:07:40.340655   49088 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:40.340674   49088 retry.go:31] will retry after 7.832295728s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:40.423881   49088 type.go:168] "Request Body" body=""
	I1202 19:07:40.423959   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:40.424241   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:40.664676   49088 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1202 19:07:40.720825   49088 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1202 19:07:40.724043   49088 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:40.724077   49088 retry.go:31] will retry after 3.410570548s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:40.923400   49088 type.go:168] "Request Body" body=""
	I1202 19:07:40.923497   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:40.923882   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:41.423714   49088 type.go:168] "Request Body" body=""
	I1202 19:07:41.423784   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:41.424115   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:07:41.424172   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:07:41.922915   49088 type.go:168] "Request Body" body=""
	I1202 19:07:41.922990   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:41.923344   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:42.422900   49088 type.go:168] "Request Body" body=""
	I1202 19:07:42.422980   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:42.423254   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:42.922960   49088 type.go:168] "Request Body" body=""
	I1202 19:07:42.923033   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:42.923341   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:43.422982   49088 type.go:168] "Request Body" body=""
	I1202 19:07:43.423067   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:43.423429   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:43.923715   49088 type.go:168] "Request Body" body=""
	I1202 19:07:43.923780   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:43.924087   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:07:43.924145   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:07:44.135480   49088 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1202 19:07:44.194407   49088 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1202 19:07:44.194462   49088 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:44.194482   49088 retry.go:31] will retry after 9.43511002s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:44.423808   49088 type.go:168] "Request Body" body=""
	I1202 19:07:44.423884   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:44.424207   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:44.923173   49088 type.go:168] "Request Body" body=""
	I1202 19:07:44.923287   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:44.923608   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:45.423511   49088 type.go:168] "Request Body" body=""
	I1202 19:07:45.423594   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:45.423852   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:45.923682   49088 type.go:168] "Request Body" body=""
	I1202 19:07:45.923754   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:45.924062   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:46.423867   49088 type.go:168] "Request Body" body=""
	I1202 19:07:46.423945   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:46.424267   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:07:46.424344   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:07:46.923022   49088 type.go:168] "Request Body" body=""
	I1202 19:07:46.923087   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:46.923335   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:47.422976   49088 type.go:168] "Request Body" body=""
	I1202 19:07:47.423049   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:47.423383   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:47.922938   49088 type.go:168] "Request Body" body=""
	I1202 19:07:47.923018   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:47.923343   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:48.173817   49088 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1202 19:07:48.233696   49088 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1202 19:07:48.233741   49088 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:48.233760   49088 retry.go:31] will retry after 11.915058211s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:48.423860   49088 type.go:168] "Request Body" body=""
	I1202 19:07:48.423931   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:48.424192   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:48.922967   49088 type.go:168] "Request Body" body=""
	I1202 19:07:48.923043   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:48.923338   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:07:48.923389   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:07:49.423071   49088 type.go:168] "Request Body" body=""
	I1202 19:07:49.423160   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:49.423457   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:49.923767   49088 type.go:168] "Request Body" body=""
	I1202 19:07:49.923839   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:49.924094   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:50.423628   49088 type.go:168] "Request Body" body=""
	I1202 19:07:50.423697   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:50.424008   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:50.923823   49088 type.go:168] "Request Body" body=""
	I1202 19:07:50.923896   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:50.924199   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:07:50.924253   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:07:51.423846   49088 type.go:168] "Request Body" body=""
	I1202 19:07:51.423915   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:51.424234   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:51.922982   49088 type.go:168] "Request Body" body=""
	I1202 19:07:51.923118   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:51.923450   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:52.423137   49088 type.go:168] "Request Body" body=""
	I1202 19:07:52.423209   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:52.423568   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:52.923553   49088 type.go:168] "Request Body" body=""
	I1202 19:07:52.923629   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:52.923890   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:53.423671   49088 type.go:168] "Request Body" body=""
	I1202 19:07:53.423751   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:53.424089   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:07:53.424151   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:07:53.630602   49088 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1202 19:07:53.701195   49088 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1202 19:07:53.708777   49088 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:53.708825   49088 retry.go:31] will retry after 18.228322251s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:53.923261   49088 type.go:168] "Request Body" body=""
	I1202 19:07:53.923336   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:53.923674   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:54.422909   49088 type.go:168] "Request Body" body=""
	I1202 19:07:54.422976   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:54.423235   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:54.923162   49088 type.go:168] "Request Body" body=""
	I1202 19:07:54.923249   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:54.923575   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:55.422969   49088 type.go:168] "Request Body" body=""
	I1202 19:07:55.423050   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:55.423372   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:55.922912   49088 type.go:168] "Request Body" body=""
	I1202 19:07:55.922984   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:55.923296   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:07:55.923346   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:07:56.422984   49088 type.go:168] "Request Body" body=""
	I1202 19:07:56.423058   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:56.423408   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:56.922983   49088 type.go:168] "Request Body" body=""
	I1202 19:07:56.923062   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:56.923429   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:57.423124   49088 type.go:168] "Request Body" body=""
	I1202 19:07:57.423196   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:57.423456   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:57.922920   49088 type.go:168] "Request Body" body=""
	I1202 19:07:57.922991   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:57.923317   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:07:57.923373   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:07:58.422950   49088 type.go:168] "Request Body" body=""
	I1202 19:07:58.423033   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:58.423366   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:58.923630   49088 type.go:168] "Request Body" body=""
	I1202 19:07:58.923696   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:58.924020   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:59.423807   49088 type.go:168] "Request Body" body=""
	I1202 19:07:59.423887   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:59.424243   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:59.922942   49088 type.go:168] "Request Body" body=""
	I1202 19:07:59.923022   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:59.923353   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:07:59.923410   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:08:00.150075   49088 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1202 19:08:00.323059   49088 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1202 19:08:00.323111   49088 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:08:00.323132   49088 retry.go:31] will retry after 12.256345503s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:08:00.423512   49088 type.go:168] "Request Body" body=""
	I1202 19:08:00.423597   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:00.423977   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:00.923784   49088 type.go:168] "Request Body" body=""
	I1202 19:08:00.923865   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:00.924196   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:01.422909   49088 type.go:168] "Request Body" body=""
	I1202 19:08:01.422990   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:01.423304   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:01.922933   49088 type.go:168] "Request Body" body=""
	I1202 19:08:01.923032   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:01.923287   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:02.422978   49088 type.go:168] "Request Body" body=""
	I1202 19:08:02.423052   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:02.423379   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:08:02.423436   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:08:02.923122   49088 type.go:168] "Request Body" body=""
	I1202 19:08:02.923245   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:02.923555   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:03.423814   49088 type.go:168] "Request Body" body=""
	I1202 19:08:03.423889   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:03.424141   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:03.923895   49088 type.go:168] "Request Body" body=""
	I1202 19:08:03.923996   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:03.924288   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:04.422987   49088 type.go:168] "Request Body" body=""
	I1202 19:08:04.423083   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:04.423375   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:04.922988   49088 type.go:168] "Request Body" body=""
	I1202 19:08:04.923059   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:04.923336   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:08:04.923376   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:08:05.422976   49088 type.go:168] "Request Body" body=""
	I1202 19:08:05.423047   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:05.423391   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:05.922959   49088 type.go:168] "Request Body" body=""
	I1202 19:08:05.923031   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:05.923359   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:06.423783   49088 type.go:168] "Request Body" body=""
	I1202 19:08:06.423854   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:06.424112   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:06.923877   49088 type.go:168] "Request Body" body=""
	I1202 19:08:06.923974   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:06.924303   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:08:06.924381   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:08:07.423044   49088 type.go:168] "Request Body" body=""
	I1202 19:08:07.423125   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:07.423474   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:07.922856   49088 type.go:168] "Request Body" body=""
	I1202 19:08:07.922930   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:07.923205   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:08.422913   49088 type.go:168] "Request Body" body=""
	I1202 19:08:08.422992   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:08.423315   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:08.922886   49088 type.go:168] "Request Body" body=""
	I1202 19:08:08.922995   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:08.923313   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:09.422913   49088 type.go:168] "Request Body" body=""
	I1202 19:08:09.423006   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:09.423295   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:08:09.423343   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:08:09.922894   49088 type.go:168] "Request Body" body=""
	I1202 19:08:09.922995   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:09.923296   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:10.423073   49088 type.go:168] "Request Body" body=""
	I1202 19:08:10.423153   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:10.423491   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:10.923741   49088 type.go:168] "Request Body" body=""
	I1202 19:08:10.923814   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:10.924073   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:11.423834   49088 type.go:168] "Request Body" body=""
	I1202 19:08:11.423907   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:11.424248   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:08:11.424304   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:08:11.922960   49088 type.go:168] "Request Body" body=""
	I1202 19:08:11.923033   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:11.923342   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:11.937687   49088 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1202 19:08:11.996748   49088 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1202 19:08:11.999800   49088 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:08:11.999828   49088 retry.go:31] will retry after 12.016513449s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:08:12.423502   49088 type.go:168] "Request Body" body=""
	I1202 19:08:12.423582   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:12.423831   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:12.580354   49088 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1202 19:08:12.637408   49088 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1202 19:08:12.637456   49088 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:08:12.637477   49088 retry.go:31] will retry after 30.215930355s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:08:12.923948   49088 type.go:168] "Request Body" body=""
	I1202 19:08:12.924043   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:12.924384   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:13.422995   49088 type.go:168] "Request Body" body=""
	I1202 19:08:13.423082   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:13.423402   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:13.923854   49088 type.go:168] "Request Body" body=""
	I1202 19:08:13.923924   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:13.924172   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:08:13.924221   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:08:14.422931   49088 type.go:168] "Request Body" body=""
	I1202 19:08:14.423013   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:14.423360   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:14.923106   49088 type.go:168] "Request Body" body=""
	I1202 19:08:14.923201   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:14.923504   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:15.423455   49088 type.go:168] "Request Body" body=""
	I1202 19:08:15.423543   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:15.423801   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:15.923582   49088 type.go:168] "Request Body" body=""
	I1202 19:08:15.923658   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:15.923982   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:16.423696   49088 type.go:168] "Request Body" body=""
	I1202 19:08:16.423768   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:16.424069   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:08:16.424123   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:08:16.923441   49088 type.go:168] "Request Body" body=""
	I1202 19:08:16.923513   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:16.923823   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:17.423623   49088 type.go:168] "Request Body" body=""
	I1202 19:08:17.423715   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:17.424059   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:17.923916   49088 type.go:168] "Request Body" body=""
	I1202 19:08:17.923987   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:17.924303   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:18.422927   49088 type.go:168] "Request Body" body=""
	I1202 19:08:18.423013   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:18.423293   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:18.922978   49088 type.go:168] "Request Body" body=""
	I1202 19:08:18.923057   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:18.923439   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:08:18.923511   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:08:19.423204   49088 type.go:168] "Request Body" body=""
	I1202 19:08:19.423280   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:19.423633   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:19.923322   49088 type.go:168] "Request Body" body=""
	I1202 19:08:19.923392   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:19.923647   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:20.423776   49088 type.go:168] "Request Body" body=""
	I1202 19:08:20.423870   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:20.424201   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:20.922961   49088 type.go:168] "Request Body" body=""
	I1202 19:08:20.923040   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:20.923370   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:21.423693   49088 type.go:168] "Request Body" body=""
	I1202 19:08:21.423801   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:21.424068   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:08:21.424117   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:08:21.923863   49088 type.go:168] "Request Body" body=""
	I1202 19:08:21.923935   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:21.924262   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:22.422993   49088 type.go:168] "Request Body" body=""
	I1202 19:08:22.423066   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:22.423391   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:22.922927   49088 type.go:168] "Request Body" body=""
	I1202 19:08:22.922998   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:22.923323   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:23.422977   49088 type.go:168] "Request Body" body=""
	I1202 19:08:23.423052   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:23.423364   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:23.922971   49088 type.go:168] "Request Body" body=""
	I1202 19:08:23.923047   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:23.923343   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:08:23.923391   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:08:24.016567   49088 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1202 19:08:24.078750   49088 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1202 19:08:24.078790   49088 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:08:24.078809   49088 retry.go:31] will retry after 37.473532818s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:08:24.423149   49088 type.go:168] "Request Body" body=""
	I1202 19:08:24.423225   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:24.423606   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:24.923585   49088 type.go:168] "Request Body" body=""
	I1202 19:08:24.923686   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:24.924015   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:25.423855   49088 type.go:168] "Request Body" body=""
	I1202 19:08:25.423933   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:25.424248   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:25.923542   49088 type.go:168] "Request Body" body=""
	I1202 19:08:25.923615   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:25.923871   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:08:25.923923   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:08:26.423702   49088 type.go:168] "Request Body" body=""
	I1202 19:08:26.423799   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:26.424100   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:26.923908   49088 type.go:168] "Request Body" body=""
	I1202 19:08:26.923990   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:26.924357   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:27.422916   49088 type.go:168] "Request Body" body=""
	I1202 19:08:27.423000   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:27.423295   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:27.922995   49088 type.go:168] "Request Body" body=""
	I1202 19:08:27.923085   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:27.923386   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:28.423073   49088 type.go:168] "Request Body" body=""
	I1202 19:08:28.423198   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:28.423550   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:08:28.423605   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:08:28.923207   49088 type.go:168] "Request Body" body=""
	I1202 19:08:28.923280   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:28.923547   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:29.423229   49088 type.go:168] "Request Body" body=""
	I1202 19:08:29.423310   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:29.423621   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:29.922966   49088 type.go:168] "Request Body" body=""
	I1202 19:08:29.923038   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:29.923334   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:30.422914   49088 type.go:168] "Request Body" body=""
	I1202 19:08:30.422986   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:30.423290   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:30.922992   49088 type.go:168] "Request Body" body=""
	I1202 19:08:30.923070   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:30.923374   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:08:30.923423   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:08:31.422962   49088 type.go:168] "Request Body" body=""
	I1202 19:08:31.423044   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:31.423370   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:31.923658   49088 type.go:168] "Request Body" body=""
	I1202 19:08:31.923727   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:31.923984   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:32.423783   49088 type.go:168] "Request Body" body=""
	I1202 19:08:32.423853   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:32.424194   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:32.923864   49088 type.go:168] "Request Body" body=""
	I1202 19:08:32.923952   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:32.924274   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:08:32.924361   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:08:33.422870   49088 type.go:168] "Request Body" body=""
	I1202 19:08:33.422935   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:33.423233   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:33.922930   49088 type.go:168] "Request Body" body=""
	I1202 19:08:33.923021   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:33.923340   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:34.423053   49088 type.go:168] "Request Body" body=""
	I1202 19:08:34.423137   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:34.423440   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:34.923286   49088 type.go:168] "Request Body" body=""
	I1202 19:08:34.923353   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:34.923610   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:35.423738   49088 type.go:168] "Request Body" body=""
	I1202 19:08:35.423811   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:35.424122   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:08:35.424178   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:08:35.923982   49088 type.go:168] "Request Body" body=""
	I1202 19:08:35.924054   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:35.924397   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:36.423490   49088 type.go:168] "Request Body" body=""
	I1202 19:08:36.423577   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:36.423904   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:36.923609   49088 type.go:168] "Request Body" body=""
	I1202 19:08:36.923698   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:36.924003   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:37.423836   49088 type.go:168] "Request Body" body=""
	I1202 19:08:37.423908   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:37.424273   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:08:37.424347   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:08:37.923878   49088 type.go:168] "Request Body" body=""
	I1202 19:08:37.923949   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:37.924222   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:38.422922   49088 type.go:168] "Request Body" body=""
	I1202 19:08:38.422995   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:38.423329   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:38.923060   49088 type.go:168] "Request Body" body=""
	I1202 19:08:38.923133   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:38.923451   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:39.422914   49088 type.go:168] "Request Body" body=""
	I1202 19:08:39.422990   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:39.423240   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:39.922980   49088 type.go:168] "Request Body" body=""
	I1202 19:08:39.923057   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:39.923354   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:08:39.923403   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:08:40.423331   49088 type.go:168] "Request Body" body=""
	I1202 19:08:40.423423   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:40.423754   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:40.923541   49088 type.go:168] "Request Body" body=""
	I1202 19:08:40.923652   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:40.923952   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:41.423758   49088 type.go:168] "Request Body" body=""
	I1202 19:08:41.423829   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:41.424159   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:41.922904   49088 type.go:168] "Request Body" body=""
	I1202 19:08:41.922987   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:41.923363   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:42.423568   49088 type.go:168] "Request Body" body=""
	I1202 19:08:42.423637   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:42.423879   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:08:42.423921   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:08:42.854609   49088 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1202 19:08:42.913285   49088 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1202 19:08:42.916268   49088 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:08:42.916300   49088 retry.go:31] will retry after 24.794449401s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:08:42.923470   49088 type.go:168] "Request Body" body=""
	I1202 19:08:42.923553   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:42.923860   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:43.423622   49088 type.go:168] "Request Body" body=""
	I1202 19:08:43.423694   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:43.423983   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:43.923751   49088 type.go:168] "Request Body" body=""
	I1202 19:08:43.923834   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:43.924123   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:44.422913   49088 type.go:168] "Request Body" body=""
	I1202 19:08:44.423014   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:44.423327   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:44.923006   49088 type.go:168] "Request Body" body=""
	I1202 19:08:44.923080   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:44.923422   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:08:44.923476   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:08:45.422929   49088 type.go:168] "Request Body" body=""
	I1202 19:08:45.423000   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:45.423274   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:45.922967   49088 type.go:168] "Request Body" body=""
	I1202 19:08:45.923040   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:45.923364   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:46.422980   49088 type.go:168] "Request Body" body=""
	I1202 19:08:46.423051   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:46.423401   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:46.922941   49088 type.go:168] "Request Body" body=""
	I1202 19:08:46.923013   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:46.923277   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:47.422968   49088 type.go:168] "Request Body" body=""
	I1202 19:08:47.423062   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:47.423388   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:08:47.423440   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:08:47.922972   49088 type.go:168] "Request Body" body=""
	I1202 19:08:47.923042   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:47.923383   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:48.423521   49088 type.go:168] "Request Body" body=""
	I1202 19:08:48.423697   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:48.424010   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:48.923782   49088 type.go:168] "Request Body" body=""
	I1202 19:08:48.923856   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:48.924186   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:49.422919   49088 type.go:168] "Request Body" body=""
	I1202 19:08:49.422992   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:49.423372   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:49.922931   49088 type.go:168] "Request Body" body=""
	I1202 19:08:49.923004   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:49.923306   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:08:49.923362   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:08:50.423001   49088 type.go:168] "Request Body" body=""
	I1202 19:08:50.423076   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:50.423400   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:50.922993   49088 type.go:168] "Request Body" body=""
	I1202 19:08:50.923072   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:50.923422   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:51.423082   49088 type.go:168] "Request Body" body=""
	I1202 19:08:51.423174   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:51.423497   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:51.922985   49088 type.go:168] "Request Body" body=""
	I1202 19:08:51.923056   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:51.923335   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:08:51.923382   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:08:52.423062   49088 type.go:168] "Request Body" body=""
	I1202 19:08:52.423141   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:52.423469   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:52.922906   49088 type.go:168] "Request Body" body=""
	I1202 19:08:52.922982   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:52.923304   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:53.422987   49088 type.go:168] "Request Body" body=""
	I1202 19:08:53.423074   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:53.423405   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:53.923177   49088 type.go:168] "Request Body" body=""
	I1202 19:08:53.923253   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:53.923577   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:08:53.923645   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:08:54.422880   49088 type.go:168] "Request Body" body=""
	I1202 19:08:54.422958   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:54.423220   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:54.923069   49088 type.go:168] "Request Body" body=""
	I1202 19:08:54.923142   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:54.923466   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:55.423373   49088 type.go:168] "Request Body" body=""
	I1202 19:08:55.423459   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:55.423806   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:55.923603   49088 type.go:168] "Request Body" body=""
	I1202 19:08:55.923681   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:55.923944   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:08:55.923992   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:08:56.423750   49088 type.go:168] "Request Body" body=""
	I1202 19:08:56.423821   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:56.424196   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:56.922939   49088 type.go:168] "Request Body" body=""
	I1202 19:08:56.923015   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:56.923399   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:57.423718   49088 type.go:168] "Request Body" body=""
	I1202 19:08:57.423789   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:57.424085   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:57.923903   49088 type.go:168] "Request Body" body=""
	I1202 19:08:57.923980   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:57.924302   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:08:57.924374   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:08:58.422958   49088 type.go:168] "Request Body" body=""
	I1202 19:08:58.423030   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:58.423367   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:58.923777   49088 type.go:168] "Request Body" body=""
	I1202 19:08:58.923851   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:58.924127   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:59.423881   49088 type.go:168] "Request Body" body=""
	I1202 19:08:59.423956   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:59.424305   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:59.922904   49088 type.go:168] "Request Body" body=""
	I1202 19:08:59.922978   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:59.923298   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:00.423245   49088 type.go:168] "Request Body" body=""
	I1202 19:09:00.423318   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:00.423619   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:09:00.423665   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:09:00.922984   49088 type.go:168] "Request Body" body=""
	I1202 19:09:00.923063   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:00.923384   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:01.422976   49088 type.go:168] "Request Body" body=""
	I1202 19:09:01.423055   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:01.423390   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:01.552630   49088 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1202 19:09:01.616821   49088 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1202 19:09:01.616872   49088 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1202 19:09:01.616967   49088 out.go:285] ! Enabling 'storage-provisioner' returned an error: running callbacks: [sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	]
	I1202 19:09:01.923268   49088 type.go:168] "Request Body" body=""
	I1202 19:09:01.923333   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:01.923595   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:02.423036   49088 type.go:168] "Request Body" body=""
	I1202 19:09:02.423106   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:02.423425   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:02.922957   49088 type.go:168] "Request Body" body=""
	I1202 19:09:02.923030   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:02.923349   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:09:02.923411   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:09:03.422870   49088 type.go:168] "Request Body" body=""
	I1202 19:09:03.422937   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:03.423202   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:03.922969   49088 type.go:168] "Request Body" body=""
	I1202 19:09:03.923048   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:03.923382   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:04.422966   49088 type.go:168] "Request Body" body=""
	I1202 19:09:04.423045   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:04.423360   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:04.923022   49088 type.go:168] "Request Body" body=""
	I1202 19:09:04.923097   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:04.923357   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:05.423319   49088 type.go:168] "Request Body" body=""
	I1202 19:09:05.423392   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:05.423740   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:09:05.423793   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:09:05.923326   49088 type.go:168] "Request Body" body=""
	I1202 19:09:05.923409   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:05.923718   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:06.423454   49088 type.go:168] "Request Body" body=""
	I1202 19:09:06.423525   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:06.423826   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:06.923640   49088 type.go:168] "Request Body" body=""
	I1202 19:09:06.923716   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:06.924092   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:07.423755   49088 type.go:168] "Request Body" body=""
	I1202 19:09:07.423833   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:07.424174   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:09:07.424240   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:09:07.711667   49088 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1202 19:09:07.768083   49088 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1202 19:09:07.771273   49088 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1202 19:09:07.771371   49088 out.go:285] ! Enabling 'default-storageclass' returned an error: running callbacks: [sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	]
	I1202 19:09:07.774489   49088 out.go:179] * Enabled addons: 
	I1202 19:09:07.778178   49088 addons.go:530] duration metric: took 1m42.874553995s for enable addons: enabled=[]
	I1202 19:09:07.923592   49088 type.go:168] "Request Body" body=""
	I1202 19:09:07.923663   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:07.923975   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:08.423753   49088 type.go:168] "Request Body" body=""
	I1202 19:09:08.423867   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:08.424222   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:08.922931   49088 type.go:168] "Request Body" body=""
	I1202 19:09:08.923003   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:08.923358   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:09.423790   49088 type.go:168] "Request Body" body=""
	I1202 19:09:09.423880   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:09.424153   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:09.922907   49088 type.go:168] "Request Body" body=""
	I1202 19:09:09.923001   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:09.923324   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:09:09.923374   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:09:10.423145   49088 type.go:168] "Request Body" body=""
	I1202 19:09:10.423260   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:10.423579   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:10.922932   49088 type.go:168] "Request Body" body=""
	I1202 19:09:10.923028   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:10.923400   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:11.422981   49088 type.go:168] "Request Body" body=""
	I1202 19:09:11.423062   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:11.423397   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:11.923082   49088 type.go:168] "Request Body" body=""
	I1202 19:09:11.923154   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:11.923464   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:09:11.923521   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:09:12.422896   49088 type.go:168] "Request Body" body=""
	I1202 19:09:12.422975   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:12.423250   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:12.922964   49088 type.go:168] "Request Body" body=""
	I1202 19:09:12.923043   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:12.923387   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:13.423093   49088 type.go:168] "Request Body" body=""
	I1202 19:09:13.423175   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:13.423500   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:13.923207   49088 type.go:168] "Request Body" body=""
	I1202 19:09:13.923272   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:13.923535   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:09:13.923574   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:09:14.422969   49088 type.go:168] "Request Body" body=""
	I1202 19:09:14.423065   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:14.423377   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:14.923293   49088 type.go:168] "Request Body" body=""
	I1202 19:09:14.923367   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:14.923688   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:15.423514   49088 type.go:168] "Request Body" body=""
	I1202 19:09:15.423584   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:15.423882   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:15.923633   49088 type.go:168] "Request Body" body=""
	I1202 19:09:15.923702   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:15.924013   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:09:15.924083   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:09:16.423892   49088 type.go:168] "Request Body" body=""
	I1202 19:09:16.423994   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:16.424346   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:16.922929   49088 type.go:168] "Request Body" body=""
	I1202 19:09:16.922996   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:16.923246   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:17.422959   49088 type.go:168] "Request Body" body=""
	I1202 19:09:17.423030   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:17.423344   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:17.923012   49088 type.go:168] "Request Body" body=""
	I1202 19:09:17.923112   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:17.923445   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:18.423579   49088 type.go:168] "Request Body" body=""
	I1202 19:09:18.423646   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:18.423912   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:09:18.423954   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:09:18.923689   49088 type.go:168] "Request Body" body=""
	I1202 19:09:18.923816   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:18.924164   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:19.423865   49088 type.go:168] "Request Body" body=""
	I1202 19:09:19.423938   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:19.424264   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:19.922971   49088 type.go:168] "Request Body" body=""
	I1202 19:09:19.923065   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:19.928773   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=5
	I1202 19:09:20.423719   49088 type.go:168] "Request Body" body=""
	I1202 19:09:20.423798   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:20.424108   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:09:20.424158   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:09:20.923797   49088 type.go:168] "Request Body" body=""
	I1202 19:09:20.923876   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:20.924234   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:21.423890   49088 type.go:168] "Request Body" body=""
	I1202 19:09:21.423990   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:21.424292   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:21.922965   49088 type.go:168] "Request Body" body=""
	I1202 19:09:21.923044   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:21.923382   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:22.422969   49088 type.go:168] "Request Body" body=""
	I1202 19:09:22.423043   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:22.423383   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:22.923067   49088 type.go:168] "Request Body" body=""
	I1202 19:09:22.923150   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:22.923449   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:09:22.923501   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:09:23.423230   49088 type.go:168] "Request Body" body=""
	I1202 19:09:23.423312   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:23.423745   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:23.922960   49088 type.go:168] "Request Body" body=""
	I1202 19:09:23.923037   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:23.923364   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:24.423610   49088 type.go:168] "Request Body" body=""
	I1202 19:09:24.423697   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:24.423973   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:24.922875   49088 type.go:168] "Request Body" body=""
	I1202 19:09:24.922950   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:24.923296   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:25.422971   49088 type.go:168] "Request Body" body=""
	I1202 19:09:25.423045   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:25.423397   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:09:25.423453   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:09:25.923781   49088 type.go:168] "Request Body" body=""
	I1202 19:09:25.923849   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:25.924111   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:26.423856   49088 type.go:168] "Request Body" body=""
	I1202 19:09:26.423928   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:26.424242   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:26.922969   49088 type.go:168] "Request Body" body=""
	I1202 19:09:26.923039   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:26.923392   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:27.423069   49088 type.go:168] "Request Body" body=""
	I1202 19:09:27.423144   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:27.423407   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:27.922946   49088 type.go:168] "Request Body" body=""
	I1202 19:09:27.923018   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:27.923358   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:09:27.923411   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:09:28.423076   49088 type.go:168] "Request Body" body=""
	I1202 19:09:28.423152   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:28.423495   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:28.923840   49088 type.go:168] "Request Body" body=""
	I1202 19:09:28.923913   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:28.924219   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:29.422911   49088 type.go:168] "Request Body" body=""
	I1202 19:09:29.422989   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:29.423326   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:29.923042   49088 type.go:168] "Request Body" body=""
	I1202 19:09:29.923119   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:29.923480   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:09:29.923538   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:09:30.422961   49088 type.go:168] "Request Body" body=""
	I1202 19:09:30.423047   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:30.423371   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:30.922960   49088 type.go:168] "Request Body" body=""
	I1202 19:09:30.923040   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:30.923370   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:31.423083   49088 type.go:168] "Request Body" body=""
	I1202 19:09:31.423155   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:31.423507   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:31.922881   49088 type.go:168] "Request Body" body=""
	I1202 19:09:31.922954   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:31.923312   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:32.422968   49088 type.go:168] "Request Body" body=""
	I1202 19:09:32.423060   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:32.423400   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:09:32.423472   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:09:32.922956   49088 type.go:168] "Request Body" body=""
	I1202 19:09:32.923050   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:32.923391   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:33.423100   49088 type.go:168] "Request Body" body=""
	I1202 19:09:33.423172   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:33.423473   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:33.922963   49088 type.go:168] "Request Body" body=""
	I1202 19:09:33.923039   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:33.923379   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:34.423096   49088 type.go:168] "Request Body" body=""
	I1202 19:09:34.423177   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:34.423484   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:09:34.423532   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:09:34.923381   49088 type.go:168] "Request Body" body=""
	I1202 19:09:34.923452   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:34.923763   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:35.423619   49088 type.go:168] "Request Body" body=""
	I1202 19:09:35.423698   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:35.424059   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:35.923863   49088 type.go:168] "Request Body" body=""
	I1202 19:09:35.923933   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:35.924297   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:36.423817   49088 type.go:168] "Request Body" body=""
	I1202 19:09:36.423883   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:36.424153   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:09:36.424193   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:09:36.922887   49088 type.go:168] "Request Body" body=""
	I1202 19:09:36.922964   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:36.923304   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:37.422984   49088 type.go:168] "Request Body" body=""
	I1202 19:09:37.423062   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:37.423400   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:37.922914   49088 type.go:168] "Request Body" body=""
	I1202 19:09:37.922987   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:37.923253   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:38.422943   49088 type.go:168] "Request Body" body=""
	I1202 19:09:38.423020   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:38.423341   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:38.922960   49088 type.go:168] "Request Body" body=""
	I1202 19:09:38.923084   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:38.923406   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:09:38.923459   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:09:39.423118   49088 type.go:168] "Request Body" body=""
	I1202 19:09:39.423188   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:39.423443   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:39.922963   49088 type.go:168] "Request Body" body=""
	I1202 19:09:39.923038   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:39.923390   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:40.422957   49088 type.go:168] "Request Body" body=""
	I1202 19:09:40.423031   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:40.423438   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:40.922909   49088 type.go:168] "Request Body" body=""
	I1202 19:09:40.922985   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:40.923328   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:41.422954   49088 type.go:168] "Request Body" body=""
	I1202 19:09:41.423029   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:41.423387   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:09:41.423450   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:09:41.923108   49088 type.go:168] "Request Body" body=""
	I1202 19:09:41.923187   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:41.923536   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:42.423214   49088 type.go:168] "Request Body" body=""
	I1202 19:09:42.423293   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:42.423567   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:42.922993   49088 type.go:168] "Request Body" body=""
	I1202 19:09:42.923073   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:42.923422   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:43.422977   49088 type.go:168] "Request Body" body=""
	I1202 19:09:43.423055   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:43.423398   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:43.923097   49088 type.go:168] "Request Body" body=""
	I1202 19:09:43.923183   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:43.923554   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:09:43.923610   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:09:44.422960   49088 type.go:168] "Request Body" body=""
	I1202 19:09:44.423031   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:44.423372   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:44.923133   49088 type.go:168] "Request Body" body=""
	I1202 19:09:44.923217   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:44.923568   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:45.422996   49088 type.go:168] "Request Body" body=""
	I1202 19:09:45.423065   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:45.423325   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:45.922966   49088 type.go:168] "Request Body" body=""
	I1202 19:09:45.923043   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:45.923392   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:46.422965   49088 type.go:168] "Request Body" body=""
	I1202 19:09:46.423041   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:46.423338   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:09:46.423383   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:09:46.923662   49088 type.go:168] "Request Body" body=""
	I1202 19:09:46.923729   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:46.923996   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:47.423794   49088 type.go:168] "Request Body" body=""
	I1202 19:09:47.423868   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:47.424199   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:47.922887   49088 type.go:168] "Request Body" body=""
	I1202 19:09:47.922970   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:47.923290   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:48.423862   49088 type.go:168] "Request Body" body=""
	I1202 19:09:48.423930   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:48.424197   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:09:48.424242   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:09:48.922882   49088 type.go:168] "Request Body" body=""
	I1202 19:09:48.922964   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:48.923305   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:49.422874   49088 type.go:168] "Request Body" body=""
	I1202 19:09:49.422952   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:49.423501   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:49.923227   49088 type.go:168] "Request Body" body=""
	I1202 19:09:49.923298   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:49.923571   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:50.423530   49088 type.go:168] "Request Body" body=""
	I1202 19:09:50.423605   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:50.423930   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:50.923709   49088 type.go:168] "Request Body" body=""
	I1202 19:09:50.923791   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:50.924129   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:09:50.924183   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:09:51.423574   49088 type.go:168] "Request Body" body=""
	I1202 19:09:51.423645   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:51.423989   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:51.923773   49088 type.go:168] "Request Body" body=""
	I1202 19:09:51.923846   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:51.924175   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:52.423792   49088 type.go:168] "Request Body" body=""
	I1202 19:09:52.423865   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:52.424194   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:52.923791   49088 type.go:168] "Request Body" body=""
	I1202 19:09:52.923863   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:52.924133   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:53.423850   49088 type.go:168] "Request Body" body=""
	I1202 19:09:53.423929   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:53.424252   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:09:53.424366   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:09:53.922972   49088 type.go:168] "Request Body" body=""
	I1202 19:09:53.923047   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:53.923376   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:54.422922   49088 type.go:168] "Request Body" body=""
	I1202 19:09:54.422999   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:54.423364   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:54.923085   49088 type.go:168] "Request Body" body=""
	I1202 19:09:54.923175   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:54.923548   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:55.422950   49088 type.go:168] "Request Body" body=""
	I1202 19:09:55.423041   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:55.423400   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:55.923676   49088 type.go:168] "Request Body" body=""
	I1202 19:09:55.923766   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:55.924024   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:09:55.924066   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:09:56.423823   49088 type.go:168] "Request Body" body=""
	I1202 19:09:56.423900   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:56.424217   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:56.922947   49088 type.go:168] "Request Body" body=""
	I1202 19:09:56.923022   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:56.923366   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:57.422911   49088 type.go:168] "Request Body" body=""
	I1202 19:09:57.422984   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:57.423240   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:57.922952   49088 type.go:168] "Request Body" body=""
	I1202 19:09:57.923033   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:57.923361   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:58.422959   49088 type.go:168] "Request Body" body=""
	I1202 19:09:58.423033   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:58.423382   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:09:58.423443   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:09:58.923738   49088 type.go:168] "Request Body" body=""
	I1202 19:09:58.923812   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:58.924072   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:59.423859   49088 type.go:168] "Request Body" body=""
	I1202 19:09:59.423937   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:59.424270   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:59.922977   49088 type.go:168] "Request Body" body=""
	I1202 19:09:59.923052   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:59.923398   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:00.435478   49088 type.go:168] "Request Body" body=""
	I1202 19:10:00.435562   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:00.435862   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:10:00.435913   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:10:00.923635   49088 type.go:168] "Request Body" body=""
	I1202 19:10:00.923715   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:00.924056   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:01.423705   49088 type.go:168] "Request Body" body=""
	I1202 19:10:01.423779   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:01.424081   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:01.923812   49088 type.go:168] "Request Body" body=""
	I1202 19:10:01.923878   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:01.924156   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:02.423889   49088 type.go:168] "Request Body" body=""
	I1202 19:10:02.423969   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:02.424269   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:02.922968   49088 type.go:168] "Request Body" body=""
	I1202 19:10:02.923056   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:02.923443   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:10:02.923500   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:10:03.423151   49088 type.go:168] "Request Body" body=""
	I1202 19:10:03.423226   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:03.423486   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:03.922960   49088 type.go:168] "Request Body" body=""
	I1202 19:10:03.923040   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:03.923371   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:04.422986   49088 type.go:168] "Request Body" body=""
	I1202 19:10:04.423059   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:04.423412   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:04.923351   49088 type.go:168] "Request Body" body=""
	I1202 19:10:04.923425   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:04.923691   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:10:04.923736   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:10:05.423841   49088 type.go:168] "Request Body" body=""
	I1202 19:10:05.423932   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:05.424309   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:05.922992   49088 type.go:168] "Request Body" body=""
	I1202 19:10:05.923078   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:05.923444   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:06.423123   49088 type.go:168] "Request Body" body=""
	I1202 19:10:06.423232   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:06.423495   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:06.922977   49088 type.go:168] "Request Body" body=""
	I1202 19:10:06.923050   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:06.923408   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:07.422988   49088 type.go:168] "Request Body" body=""
	I1202 19:10:07.423069   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:07.423489   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:10:07.423547   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:10:07.923028   49088 type.go:168] "Request Body" body=""
	I1202 19:10:07.923107   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:07.923442   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:08.422975   49088 type.go:168] "Request Body" body=""
	I1202 19:10:08.423051   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:08.423408   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:08.922956   49088 type.go:168] "Request Body" body=""
	I1202 19:10:08.923038   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:08.923365   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:09.423008   49088 type.go:168] "Request Body" body=""
	I1202 19:10:09.423094   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:09.423439   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:09.922983   49088 type.go:168] "Request Body" body=""
	I1202 19:10:09.923062   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:09.923410   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:10:09.923465   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:10:10.422981   49088 type.go:168] "Request Body" body=""
	I1202 19:10:10.423060   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:10.423387   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:10.922955   49088 type.go:168] "Request Body" body=""
	I1202 19:10:10.923025   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:10.923302   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:11.422980   49088 type.go:168] "Request Body" body=""
	I1202 19:10:11.423052   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:11.423400   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:11.922952   49088 type.go:168] "Request Body" body=""
	I1202 19:10:11.923032   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:11.923379   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:12.422929   49088 type.go:168] "Request Body" body=""
	I1202 19:10:12.423003   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:12.423285   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:10:12.423327   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:10:12.922978   49088 type.go:168] "Request Body" body=""
	I1202 19:10:12.923051   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:12.923388   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:13.422975   49088 type.go:168] "Request Body" body=""
	I1202 19:10:13.423050   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:13.423417   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:13.922921   49088 type.go:168] "Request Body" body=""
	I1202 19:10:13.922997   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:13.923304   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:14.422948   49088 type.go:168] "Request Body" body=""
	I1202 19:10:14.423026   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:14.423361   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:10:14.423418   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:10:14.923139   49088 type.go:168] "Request Body" body=""
	I1202 19:10:14.923221   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:14.923551   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:15.423337   49088 type.go:168] "Request Body" body=""
	I1202 19:10:15.423420   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:15.423733   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:15.922959   49088 type.go:168] "Request Body" body=""
	I1202 19:10:15.923040   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:15.923380   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:16.422956   49088 type.go:168] "Request Body" body=""
	I1202 19:10:16.423036   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:16.423383   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:16.923752   49088 type.go:168] "Request Body" body=""
	I1202 19:10:16.923818   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:16.924073   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:10:16.924112   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:10:17.423826   49088 type.go:168] "Request Body" body=""
	I1202 19:10:17.423915   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:17.424256   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:17.922988   49088 type.go:168] "Request Body" body=""
	I1202 19:10:17.923068   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:17.923403   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:18.422876   49088 type.go:168] "Request Body" body=""
	I1202 19:10:18.422953   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:18.423220   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:18.922913   49088 type.go:168] "Request Body" body=""
	I1202 19:10:18.922988   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:18.923326   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:19.423037   49088 type.go:168] "Request Body" body=""
	I1202 19:10:19.423111   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:19.423450   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:10:19.423505   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:10:19.923780   49088 type.go:168] "Request Body" body=""
	I1202 19:10:19.923847   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:19.924112   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:20.423105   49088 type.go:168] "Request Body" body=""
	I1202 19:10:20.423212   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:20.423516   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:20.922965   49088 type.go:168] "Request Body" body=""
	I1202 19:10:20.923060   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:20.923378   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:21.423065   49088 type.go:168] "Request Body" body=""
	I1202 19:10:21.423140   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:21.423414   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:21.923020   49088 type.go:168] "Request Body" body=""
	I1202 19:10:21.923093   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:21.923371   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:10:21.923415   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:10:22.423093   49088 type.go:168] "Request Body" body=""
	I1202 19:10:22.423166   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:22.423511   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:22.923085   49088 type.go:168] "Request Body" body=""
	I1202 19:10:22.923154   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:22.923434   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:23.423060   49088 type.go:168] "Request Body" body=""
	I1202 19:10:23.423133   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:23.423446   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:23.923195   49088 type.go:168] "Request Body" body=""
	I1202 19:10:23.923269   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:23.923577   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:10:23.923622   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:10:24.422877   49088 type.go:168] "Request Body" body=""
	I1202 19:10:24.422957   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:24.423230   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:24.923115   49088 type.go:168] "Request Body" body=""
	I1202 19:10:24.923194   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:24.923600   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:25.423537   49088 type.go:168] "Request Body" body=""
	I1202 19:10:25.423610   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:25.423949   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:25.923703   49088 type.go:168] "Request Body" body=""
	I1202 19:10:25.923775   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:25.924043   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:10:25.924103   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:10:26.423902   49088 type.go:168] "Request Body" body=""
	I1202 19:10:26.423982   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:26.424292   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:26.922962   49088 type.go:168] "Request Body" body=""
	I1202 19:10:26.923073   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:26.923396   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:27.423706   49088 type.go:168] "Request Body" body=""
	I1202 19:10:27.423778   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:27.424090   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:27.923873   49088 type.go:168] "Request Body" body=""
	I1202 19:10:27.923954   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:27.924307   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:10:27.924397   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:10:28.422900   49088 type.go:168] "Request Body" body=""
	I1202 19:10:28.423017   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:28.423349   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:28.922918   49088 type.go:168] "Request Body" body=""
	I1202 19:10:28.922988   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:28.923270   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:29.422990   49088 type.go:168] "Request Body" body=""
	I1202 19:10:29.423072   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:29.423426   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:29.922978   49088 type.go:168] "Request Body" body=""
	I1202 19:10:29.923053   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:29.923386   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:30.422933   49088 type.go:168] "Request Body" body=""
	I1202 19:10:30.423008   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:30.423306   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:10:30.423392   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:10:30.923054   49088 type.go:168] "Request Body" body=""
	I1202 19:10:30.923133   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:30.923439   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:31.423123   49088 type.go:168] "Request Body" body=""
	I1202 19:10:31.423203   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:31.423539   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:31.923853   49088 type.go:168] "Request Body" body=""
	I1202 19:10:31.923920   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:31.924180   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:32.422889   49088 type.go:168] "Request Body" body=""
	I1202 19:10:32.422970   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:32.423319   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:32.922910   49088 type.go:168] "Request Body" body=""
	I1202 19:10:32.922985   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:32.923321   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:10:32.923375   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:10:33.423014   49088 type.go:168] "Request Body" body=""
	I1202 19:10:33.423095   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:33.423398   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:33.923072   49088 type.go:168] "Request Body" body=""
	I1202 19:10:33.923148   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:33.923513   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:34.423108   49088 type.go:168] "Request Body" body=""
	I1202 19:10:34.423190   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:34.423541   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:34.923286   49088 type.go:168] "Request Body" body=""
	I1202 19:10:34.923355   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:34.923630   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:10:34.923672   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:10:35.423752   49088 type.go:168] "Request Body" body=""
	I1202 19:10:35.423834   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:35.424190   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:35.922887   49088 type.go:168] "Request Body" body=""
	I1202 19:10:35.922967   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:35.923295   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:36.422982   49088 type.go:168] "Request Body" body=""
	I1202 19:10:36.423054   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:36.423305   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:36.923037   49088 type.go:168] "Request Body" body=""
	I1202 19:10:36.923115   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:36.923442   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:37.423029   49088 type.go:168] "Request Body" body=""
	I1202 19:10:37.423102   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:37.423425   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:10:37.423480   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:10:37.923773   49088 type.go:168] "Request Body" body=""
	I1202 19:10:37.923861   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:37.924136   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:38.423899   49088 type.go:168] "Request Body" body=""
	I1202 19:10:38.423979   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:38.424296   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:38.922970   49088 type.go:168] "Request Body" body=""
	I1202 19:10:38.923051   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:38.923372   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:39.423713   49088 type.go:168] "Request Body" body=""
	I1202 19:10:39.423780   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:39.424040   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:10:39.424081   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:10:39.923835   49088 type.go:168] "Request Body" body=""
	I1202 19:10:39.923908   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:39.924227   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:40.423209   49088 type.go:168] "Request Body" body=""
	I1202 19:10:40.423286   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:40.423612   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:40.923000   49088 type.go:168] "Request Body" body=""
	I1202 19:10:40.923083   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:40.923387   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:41.422980   49088 type.go:168] "Request Body" body=""
	I1202 19:10:41.423055   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:41.423415   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:41.922974   49088 type.go:168] "Request Body" body=""
	I1202 19:10:41.923048   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:41.923367   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:10:41.923430   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:10:42.422892   49088 type.go:168] "Request Body" body=""
	I1202 19:10:42.422960   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:42.423218   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:42.922977   49088 type.go:168] "Request Body" body=""
	I1202 19:10:42.923049   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:42.923389   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:43.422967   49088 type.go:168] "Request Body" body=""
	I1202 19:10:43.423039   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:43.423388   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:43.922911   49088 type.go:168] "Request Body" body=""
	I1202 19:10:43.922995   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:43.923286   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:44.422975   49088 type.go:168] "Request Body" body=""
	I1202 19:10:44.423057   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:44.423388   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:10:44.423441   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:10:44.923165   49088 type.go:168] "Request Body" body=""
	I1202 19:10:44.923245   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:44.923572   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:45.423613   49088 type.go:168] "Request Body" body=""
	I1202 19:10:45.423695   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:45.423958   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:45.924133   49088 type.go:168] "Request Body" body=""
	I1202 19:10:45.924208   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:45.924557   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:46.423281   49088 type.go:168] "Request Body" body=""
	I1202 19:10:46.423365   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:46.423700   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:10:46.423760   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:10:46.923435   49088 type.go:168] "Request Body" body=""
	I1202 19:10:46.923504   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:46.923772   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:47.423542   49088 type.go:168] "Request Body" body=""
	I1202 19:10:47.423616   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:47.423946   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:47.923718   49088 type.go:168] "Request Body" body=""
	I1202 19:10:47.923790   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:47.924128   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:48.423446   49088 type.go:168] "Request Body" body=""
	I1202 19:10:48.423517   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:48.423772   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:10:48.423814   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:10:48.923540   49088 type.go:168] "Request Body" body=""
	I1202 19:10:48.923616   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:48.923948   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:49.423625   49088 type.go:168] "Request Body" body=""
	I1202 19:10:49.423703   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:49.424044   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:49.923749   49088 type.go:168] "Request Body" body=""
	I1202 19:10:49.923817   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:49.924118   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:50.422979   49088 type.go:168] "Request Body" body=""
	I1202 19:10:50.423051   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:50.423386   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:50.923086   49088 type.go:168] "Request Body" body=""
	I1202 19:10:50.923163   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:50.923498   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:10:50.923558   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:10:51.422877   49088 type.go:168] "Request Body" body=""
	I1202 19:10:51.422947   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:51.423236   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:51.922968   49088 type.go:168] "Request Body" body=""
	I1202 19:10:51.923043   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:51.923370   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:52.423002   49088 type.go:168] "Request Body" body=""
	I1202 19:10:52.423076   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:52.423410   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:52.922879   49088 type.go:168] "Request Body" body=""
	I1202 19:10:52.922948   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:52.923224   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:53.422960   49088 type.go:168] "Request Body" body=""
	I1202 19:10:53.423034   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:53.423361   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:10:53.423412   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:10:53.923077   49088 type.go:168] "Request Body" body=""
	I1202 19:10:53.923157   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:53.923495   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:54.422917   49088 type.go:168] "Request Body" body=""
	I1202 19:10:54.422986   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:54.423246   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:54.923135   49088 type.go:168] "Request Body" body=""
	I1202 19:10:54.923208   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:54.923556   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:55.423559   49088 type.go:168] "Request Body" body=""
	I1202 19:10:55.423636   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:55.423971   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:10:55.424022   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:10:55.923605   49088 type.go:168] "Request Body" body=""
	I1202 19:10:55.923678   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:55.923946   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:56.423714   49088 type.go:168] "Request Body" body=""
	I1202 19:10:56.423787   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:56.424129   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:56.923785   49088 type.go:168] "Request Body" body=""
	I1202 19:10:56.923856   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:56.924173   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:57.423656   49088 type.go:168] "Request Body" body=""
	I1202 19:10:57.423734   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:57.423996   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:57.923693   49088 type.go:168] "Request Body" body=""
	I1202 19:10:57.923764   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:57.924078   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:10:57.924131   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:10:58.423895   49088 type.go:168] "Request Body" body=""
	I1202 19:10:58.423973   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:58.424286   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:58.922875   49088 type.go:168] "Request Body" body=""
	I1202 19:10:58.922950   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:58.923200   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:59.422980   49088 type.go:168] "Request Body" body=""
	I1202 19:10:59.423061   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:59.423383   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:59.922968   49088 type.go:168] "Request Body" body=""
	I1202 19:10:59.923042   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:59.923368   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:00.423287   49088 type.go:168] "Request Body" body=""
	I1202 19:11:00.423360   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:00.423657   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:11:00.423700   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:11:00.922983   49088 type.go:168] "Request Body" body=""
	I1202 19:11:00.923059   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:00.923393   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:01.423104   49088 type.go:168] "Request Body" body=""
	I1202 19:11:01.423186   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:01.423527   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:01.923028   49088 type.go:168] "Request Body" body=""
	I1202 19:11:01.923097   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:01.923355   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:02.422961   49088 type.go:168] "Request Body" body=""
	I1202 19:11:02.423036   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:02.423393   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:02.923093   49088 type.go:168] "Request Body" body=""
	I1202 19:11:02.923169   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:02.923483   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:11:02.923543   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:11:03.422923   49088 type.go:168] "Request Body" body=""
	I1202 19:11:03.422993   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:03.423275   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:03.922966   49088 type.go:168] "Request Body" body=""
	I1202 19:11:03.923046   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:03.923401   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:04.423109   49088 type.go:168] "Request Body" body=""
	I1202 19:11:04.423183   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:04.423480   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:04.922963   49088 type.go:168] "Request Body" body=""
	I1202 19:11:04.923029   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:04.923291   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:05.422943   49088 type.go:168] "Request Body" body=""
	I1202 19:11:05.423019   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:05.423416   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:11:05.423485   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:11:05.922975   49088 type.go:168] "Request Body" body=""
	I1202 19:11:05.923049   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:05.923391   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:06.423068   49088 type.go:168] "Request Body" body=""
	I1202 19:11:06.423143   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:06.423404   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:06.922954   49088 type.go:168] "Request Body" body=""
	I1202 19:11:06.923033   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:06.923371   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:07.423090   49088 type.go:168] "Request Body" body=""
	I1202 19:11:07.423173   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:07.423511   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:11:07.423577   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:11:07.922961   49088 type.go:168] "Request Body" body=""
	I1202 19:11:07.923059   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:07.923477   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:08.423196   49088 type.go:168] "Request Body" body=""
	I1202 19:11:08.423268   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:08.423618   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:08.923317   49088 type.go:168] "Request Body" body=""
	I1202 19:11:08.923395   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:08.923714   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:09.423471   49088 type.go:168] "Request Body" body=""
	I1202 19:11:09.423536   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:09.423793   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:11:09.423831   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:11:09.923592   49088 type.go:168] "Request Body" body=""
	I1202 19:11:09.923672   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:09.923995   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:10.422883   49088 type.go:168] "Request Body" body=""
	I1202 19:11:10.422965   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:10.423316   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:10.922966   49088 type.go:168] "Request Body" body=""
	I1202 19:11:10.923035   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:10.923371   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:11.422986   49088 type.go:168] "Request Body" body=""
	I1202 19:11:11.423060   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:11.423401   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:11.923095   49088 type.go:168] "Request Body" body=""
	I1202 19:11:11.923169   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:11.923521   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:11:11.923576   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:11:12.423858   49088 type.go:168] "Request Body" body=""
	I1202 19:11:12.423929   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:12.424221   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:12.922920   49088 type.go:168] "Request Body" body=""
	I1202 19:11:12.923007   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:12.923376   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:13.423103   49088 type.go:168] "Request Body" body=""
	I1202 19:11:13.423187   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:13.423573   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:13.923844   49088 type.go:168] "Request Body" body=""
	I1202 19:11:13.923913   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:13.924261   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:11:13.924357   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:11:14.422970   49088 type.go:168] "Request Body" body=""
	I1202 19:11:14.423042   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:14.423400   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:14.923176   49088 type.go:168] "Request Body" body=""
	I1202 19:11:14.923259   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:14.923607   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:15.423538   49088 type.go:168] "Request Body" body=""
	I1202 19:11:15.423610   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:15.423918   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:15.923744   49088 type.go:168] "Request Body" body=""
	I1202 19:11:15.923820   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:15.924119   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:16.423897   49088 type.go:168] "Request Body" body=""
	I1202 19:11:16.423968   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:16.424281   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:11:16.424354   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:11:16.922924   49088 type.go:168] "Request Body" body=""
	I1202 19:11:16.923006   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:16.923265   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:17.422970   49088 type.go:168] "Request Body" body=""
	I1202 19:11:17.423042   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:17.423367   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:17.922982   49088 type.go:168] "Request Body" body=""
	I1202 19:11:17.923056   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:17.923356   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:18.422877   49088 type.go:168] "Request Body" body=""
	I1202 19:11:18.422949   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:18.423201   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:18.922936   49088 type.go:168] "Request Body" body=""
	I1202 19:11:18.923013   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:18.923346   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:11:18.923404   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:11:19.422980   49088 type.go:168] "Request Body" body=""
	I1202 19:11:19.423053   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:19.423374   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:19.922930   49088 type.go:168] "Request Body" body=""
	I1202 19:11:19.923009   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:19.923288   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:20.422975   49088 type.go:168] "Request Body" body=""
	I1202 19:11:20.423047   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:20.423341   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:20.923042   49088 type.go:168] "Request Body" body=""
	I1202 19:11:20.923120   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:20.923474   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:11:20.923528   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:11:21.423170   49088 type.go:168] "Request Body" body=""
	I1202 19:11:21.423246   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:21.423533   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:21.922987   49088 type.go:168] "Request Body" body=""
	I1202 19:11:21.923069   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:21.923428   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:22.423136   49088 type.go:168] "Request Body" body=""
	I1202 19:11:22.423218   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:22.423580   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:22.923816   49088 type.go:168] "Request Body" body=""
	I1202 19:11:22.923890   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:22.924148   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:11:22.924186   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:11:23.422866   49088 type.go:168] "Request Body" body=""
	I1202 19:11:23.422936   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:23.423286   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:23.923262   49088 type.go:168] "Request Body" body=""
	I1202 19:11:23.923352   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:23.923994   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:24.422942   49088 type.go:168] "Request Body" body=""
	I1202 19:11:24.423029   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:24.423374   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:24.922937   49088 type.go:168] "Request Body" body=""
	I1202 19:11:24.923030   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:24.923429   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:25.422933   49088 type.go:168] "Request Body" body=""
	I1202 19:11:25.423008   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:25.423357   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:11:25.423414   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:11:25.922926   49088 type.go:168] "Request Body" body=""
	I1202 19:11:25.922991   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:25.923253   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:26.422987   49088 type.go:168] "Request Body" body=""
	I1202 19:11:26.423057   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:26.423401   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:26.923094   49088 type.go:168] "Request Body" body=""
	I1202 19:11:26.923165   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:26.923469   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:27.422920   49088 type.go:168] "Request Body" body=""
	I1202 19:11:27.422991   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:27.423278   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:27.922888   49088 type.go:168] "Request Body" body=""
	I1202 19:11:27.922961   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:27.923314   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:11:27.923368   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:11:28.422912   49088 type.go:168] "Request Body" body=""
	I1202 19:11:28.422990   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:28.423375   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:28.923772   49088 type.go:168] "Request Body" body=""
	I1202 19:11:28.923838   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:28.924083   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:29.423850   49088 type.go:168] "Request Body" body=""
	I1202 19:11:29.423927   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:29.424260   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:29.923896   49088 type.go:168] "Request Body" body=""
	I1202 19:11:29.923968   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:29.924284   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:11:29.924358   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:11:30.423879   49088 type.go:168] "Request Body" body=""
	I1202 19:11:30.423953   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:30.424220   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:30.922930   49088 type.go:168] "Request Body" body=""
	I1202 19:11:30.923004   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:30.923350   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:31.423073   49088 type.go:168] "Request Body" body=""
	I1202 19:11:31.423147   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:31.423507   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:31.922953   49088 type.go:168] "Request Body" body=""
	I1202 19:11:31.923031   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:31.923337   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:32.422971   49088 type.go:168] "Request Body" body=""
	I1202 19:11:32.423046   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:32.423366   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:11:32.423417   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:11:32.923034   49088 type.go:168] "Request Body" body=""
	I1202 19:11:32.923109   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:32.923438   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:33.423135   49088 type.go:168] "Request Body" body=""
	I1202 19:11:33.423204   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:33.423464   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:33.922974   49088 type.go:168] "Request Body" body=""
	I1202 19:11:33.923049   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:33.923387   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:34.423091   49088 type.go:168] "Request Body" body=""
	I1202 19:11:34.423168   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:34.423523   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:11:34.423580   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:11:34.923239   49088 type.go:168] "Request Body" body=""
	I1202 19:11:34.923307   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:34.923573   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:35.423667   49088 type.go:168] "Request Body" body=""
	I1202 19:11:35.423743   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:35.424088   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:35.923861   49088 type.go:168] "Request Body" body=""
	I1202 19:11:35.923938   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:35.924296   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:36.422936   49088 type.go:168] "Request Body" body=""
	I1202 19:11:36.423001   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:36.423257   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:36.922952   49088 type.go:168] "Request Body" body=""
	I1202 19:11:36.923029   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:36.923375   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:11:36.923429   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:11:37.422938   49088 type.go:168] "Request Body" body=""
	I1202 19:11:37.423016   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:37.423347   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:37.923043   49088 type.go:168] "Request Body" body=""
	I1202 19:11:37.923123   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:37.923400   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:38.422941   49088 type.go:168] "Request Body" body=""
	I1202 19:11:38.423011   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:38.423321   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:38.922959   49088 type.go:168] "Request Body" body=""
	I1202 19:11:38.923034   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:38.923352   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:39.422867   49088 type.go:168] "Request Body" body=""
	I1202 19:11:39.422947   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:39.423239   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:11:39.423286   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:11:39.922967   49088 type.go:168] "Request Body" body=""
	I1202 19:11:39.923043   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:39.923373   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:40.422980   49088 type.go:168] "Request Body" body=""
	I1202 19:11:40.423062   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:40.423412   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:40.923647   49088 type.go:168] "Request Body" body=""
	I1202 19:11:40.923717   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:40.923978   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:41.423742   49088 type.go:168] "Request Body" body=""
	I1202 19:11:41.423821   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:41.424168   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:11:41.424221   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:11:41.922872   49088 type.go:168] "Request Body" body=""
	I1202 19:11:41.922945   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:41.923310   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:42.423001   49088 type.go:168] "Request Body" body=""
	I1202 19:11:42.423082   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:42.423364   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:42.922973   49088 type.go:168] "Request Body" body=""
	I1202 19:11:42.923054   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:42.923395   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:43.422979   49088 type.go:168] "Request Body" body=""
	I1202 19:11:43.423052   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:43.423368   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:43.922915   49088 type.go:168] "Request Body" body=""
	I1202 19:11:43.922986   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:43.923252   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:11:43.923292   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:11:44.422977   49088 type.go:168] "Request Body" body=""
	I1202 19:11:44.423058   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:44.423399   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:44.923181   49088 type.go:168] "Request Body" body=""
	I1202 19:11:44.923261   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:44.923585   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:45.423566   49088 type.go:168] "Request Body" body=""
	I1202 19:11:45.423644   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:45.423912   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:45.923624   49088 type.go:168] "Request Body" body=""
	I1202 19:11:45.923703   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:45.924023   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:11:45.924071   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:11:46.423663   49088 type.go:168] "Request Body" body=""
	I1202 19:11:46.423734   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:46.424056   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:46.923091   49088 type.go:168] "Request Body" body=""
	I1202 19:11:46.923157   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:46.923456   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:47.423153   49088 type.go:168] "Request Body" body=""
	I1202 19:11:47.423230   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:47.423569   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:47.923278   49088 type.go:168] "Request Body" body=""
	I1202 19:11:47.923362   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:47.923689   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:48.422896   49088 type.go:168] "Request Body" body=""
	I1202 19:11:48.422973   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:48.423231   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:11:48.423281   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:11:48.922955   49088 type.go:168] "Request Body" body=""
	I1202 19:11:48.923028   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:48.923392   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:49.423094   49088 type.go:168] "Request Body" body=""
	I1202 19:11:49.423172   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:49.423529   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:49.923206   49088 type.go:168] "Request Body" body=""
	I1202 19:11:49.923275   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:49.923532   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:50.423633   49088 type.go:168] "Request Body" body=""
	I1202 19:11:50.423711   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:50.424022   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:11:50.424076   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:11:50.923805   49088 type.go:168] "Request Body" body=""
	I1202 19:11:50.923878   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:50.924219   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:51.423760   49088 type.go:168] "Request Body" body=""
	I1202 19:11:51.423833   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:51.424113   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:51.923870   49088 type.go:168] "Request Body" body=""
	I1202 19:11:51.923950   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:51.924286   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:52.422868   49088 type.go:168] "Request Body" body=""
	I1202 19:11:52.422952   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:52.423285   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:52.923892   49088 type.go:168] "Request Body" body=""
	I1202 19:11:52.923962   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:52.924248   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:11:52.924291   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:11:53.422923   49088 type.go:168] "Request Body" body=""
	I1202 19:11:53.423002   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:53.423357   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:53.922965   49088 type.go:168] "Request Body" body=""
	I1202 19:11:53.923041   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:53.923384   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:54.422914   49088 type.go:168] "Request Body" body=""
	I1202 19:11:54.422991   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:54.423296   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:54.922927   49088 type.go:168] "Request Body" body=""
	I1202 19:11:54.923011   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:54.923324   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:55.423007   49088 type.go:168] "Request Body" body=""
	I1202 19:11:55.423084   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:55.423406   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:11:55.423459   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:11:55.922966   49088 type.go:168] "Request Body" body=""
	I1202 19:11:55.923046   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:55.923318   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:56.423006   49088 type.go:168] "Request Body" body=""
	I1202 19:11:56.423078   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:56.423413   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:56.923138   49088 type.go:168] "Request Body" body=""
	I1202 19:11:56.923215   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:56.923566   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:57.423251   49088 type.go:168] "Request Body" body=""
	I1202 19:11:57.423331   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:57.423624   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:11:57.423682   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:11:57.923555   49088 type.go:168] "Request Body" body=""
	I1202 19:11:57.923629   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:57.923962   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:58.423744   49088 type.go:168] "Request Body" body=""
	I1202 19:11:58.423824   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:58.424157   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:58.923666   49088 type.go:168] "Request Body" body=""
	I1202 19:11:58.923737   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:58.924002   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:59.423779   49088 type.go:168] "Request Body" body=""
	I1202 19:11:59.423856   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:59.424204   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:11:59.424262   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:11:59.922940   49088 type.go:168] "Request Body" body=""
	I1202 19:11:59.923028   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:59.923350   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:00.422961   49088 type.go:168] "Request Body" body=""
	I1202 19:12:00.423042   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:00.423415   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:00.923010   49088 type.go:168] "Request Body" body=""
	I1202 19:12:00.923091   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:00.923432   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:01.423015   49088 type.go:168] "Request Body" body=""
	I1202 19:12:01.423098   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:01.423440   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:01.923901   49088 type.go:168] "Request Body" body=""
	I1202 19:12:01.923978   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:01.924301   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:12:01.924373   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:12:02.423047   49088 type.go:168] "Request Body" body=""
	I1202 19:12:02.423120   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:02.423479   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:02.923197   49088 type.go:168] "Request Body" body=""
	I1202 19:12:02.923275   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:02.923599   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:03.422914   49088 type.go:168] "Request Body" body=""
	I1202 19:12:03.422989   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:03.423273   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:03.922947   49088 type.go:168] "Request Body" body=""
	I1202 19:12:03.923023   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:03.923352   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:04.422979   49088 type.go:168] "Request Body" body=""
	I1202 19:12:04.423057   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:04.423399   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:12:04.423455   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:12:04.923130   49088 type.go:168] "Request Body" body=""
	I1202 19:12:04.923196   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:04.923453   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:05.423379   49088 type.go:168] "Request Body" body=""
	I1202 19:12:05.423457   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:05.423797   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:05.923568   49088 type.go:168] "Request Body" body=""
	I1202 19:12:05.923643   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:05.923966   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:06.423488   49088 type.go:168] "Request Body" body=""
	I1202 19:12:06.423556   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:06.423829   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:12:06.423875   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:12:06.923674   49088 type.go:168] "Request Body" body=""
	I1202 19:12:06.923754   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:06.924076   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:07.423881   49088 type.go:168] "Request Body" body=""
	I1202 19:12:07.423954   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:07.424301   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:07.923933   49088 type.go:168] "Request Body" body=""
	I1202 19:12:07.924013   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:07.924280   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:08.422981   49088 type.go:168] "Request Body" body=""
	I1202 19:12:08.423059   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:08.423411   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:08.923117   49088 type.go:168] "Request Body" body=""
	I1202 19:12:08.923190   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:08.923511   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:12:08.923573   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:12:09.422927   49088 type.go:168] "Request Body" body=""
	I1202 19:12:09.422996   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:09.423311   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:09.923024   49088 type.go:168] "Request Body" body=""
	I1202 19:12:09.923100   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:09.923439   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:10.422995   49088 type.go:168] "Request Body" body=""
	I1202 19:12:10.423070   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:10.423406   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:10.922922   49088 type.go:168] "Request Body" body=""
	I1202 19:12:10.922997   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:10.923336   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:11.422925   49088 type.go:168] "Request Body" body=""
	I1202 19:12:11.423002   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:11.423341   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:12:11.423398   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:12:11.923073   49088 type.go:168] "Request Body" body=""
	I1202 19:12:11.923147   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:11.923465   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:12.422920   49088 type.go:168] "Request Body" body=""
	I1202 19:12:12.422996   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:12.423266   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:12.922985   49088 type.go:168] "Request Body" body=""
	I1202 19:12:12.923057   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:12.923408   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:13.422982   49088 type.go:168] "Request Body" body=""
	I1202 19:12:13.423065   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:13.423401   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:12:13.423447   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:12:13.923740   49088 type.go:168] "Request Body" body=""
	I1202 19:12:13.923807   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:13.924068   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:14.423836   49088 type.go:168] "Request Body" body=""
	I1202 19:12:14.423917   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:14.424266   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:14.922896   49088 type.go:168] "Request Body" body=""
	I1202 19:12:14.922969   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:14.923318   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:15.423109   49088 type.go:168] "Request Body" body=""
	I1202 19:12:15.423181   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:15.423435   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:12:15.423486   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:12:15.922969   49088 type.go:168] "Request Body" body=""
	I1202 19:12:15.923045   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:15.923399   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:16.422983   49088 type.go:168] "Request Body" body=""
	I1202 19:12:16.423056   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:16.423395   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:16.922950   49088 type.go:168] "Request Body" body=""
	I1202 19:12:16.923033   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:16.923357   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:17.422981   49088 type.go:168] "Request Body" body=""
	I1202 19:12:17.423053   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:17.423403   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:17.923097   49088 type.go:168] "Request Body" body=""
	I1202 19:12:17.923175   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:17.923503   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:12:17.923560   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:12:18.423204   49088 type.go:168] "Request Body" body=""
	I1202 19:12:18.423269   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:18.423531   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:18.922961   49088 type.go:168] "Request Body" body=""
	I1202 19:12:18.923042   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:18.923383   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:19.422976   49088 type.go:168] "Request Body" body=""
	I1202 19:12:19.423057   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:19.423400   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:19.923083   49088 type.go:168] "Request Body" body=""
	I1202 19:12:19.923154   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:19.923406   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:20.423485   49088 type.go:168] "Request Body" body=""
	I1202 19:12:20.423567   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:20.423913   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:12:20.423967   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:12:20.923722   49088 type.go:168] "Request Body" body=""
	I1202 19:12:20.923795   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:20.924168   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:21.422933   49088 type.go:168] "Request Body" body=""
	I1202 19:12:21.422998   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:21.423294   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:21.922979   49088 type.go:168] "Request Body" body=""
	I1202 19:12:21.923058   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:21.923391   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:22.422989   49088 type.go:168] "Request Body" body=""
	I1202 19:12:22.423061   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:22.423414   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:22.923563   49088 type.go:168] "Request Body" body=""
	I1202 19:12:22.923637   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:22.923893   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:12:22.923932   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:12:23.423651   49088 type.go:168] "Request Body" body=""
	I1202 19:12:23.423730   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:23.424077   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:23.923725   49088 type.go:168] "Request Body" body=""
	I1202 19:12:23.923795   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:23.924130   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:24.423644   49088 type.go:168] "Request Body" body=""
	I1202 19:12:24.423724   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:24.424004   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:24.923060   49088 type.go:168] "Request Body" body=""
	I1202 19:12:24.923142   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:24.923487   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:25.423281   49088 type.go:168] "Request Body" body=""
	I1202 19:12:25.423364   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:25.423721   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:12:25.423779   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:12:25.923482   49088 type.go:168] "Request Body" body=""
	I1202 19:12:25.923550   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:25.923808   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:26.423542   49088 type.go:168] "Request Body" body=""
	I1202 19:12:26.423613   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:26.423935   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:26.923724   49088 type.go:168] "Request Body" body=""
	I1202 19:12:26.923807   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:26.924187   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:27.422900   49088 type.go:168] "Request Body" body=""
	I1202 19:12:27.422973   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:27.423252   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:27.922940   49088 type.go:168] "Request Body" body=""
	I1202 19:12:27.923019   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:27.923343   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:12:27.923403   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:12:28.422986   49088 type.go:168] "Request Body" body=""
	I1202 19:12:28.423061   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:28.423432   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:28.922921   49088 type.go:168] "Request Body" body=""
	I1202 19:12:28.923034   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:28.923349   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:29.422983   49088 type.go:168] "Request Body" body=""
	I1202 19:12:29.423092   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:29.423421   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:29.923070   49088 type.go:168] "Request Body" body=""
	I1202 19:12:29.923147   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:29.923486   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:12:29.923558   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:12:30.423578   49088 type.go:168] "Request Body" body=""
	I1202 19:12:30.423659   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:30.423950   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:30.923208   49088 type.go:168] "Request Body" body=""
	I1202 19:12:30.923281   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:30.923639   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:31.422982   49088 type.go:168] "Request Body" body=""
	I1202 19:12:31.423059   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:31.423398   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:31.922934   49088 type.go:168] "Request Body" body=""
	I1202 19:12:31.923000   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:31.923257   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:32.422953   49088 type.go:168] "Request Body" body=""
	I1202 19:12:32.423028   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:32.423354   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:12:32.423413   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:12:32.923085   49088 type.go:168] "Request Body" body=""
	I1202 19:12:32.923168   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:32.923564   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:33.423262   49088 type.go:168] "Request Body" body=""
	I1202 19:12:33.423340   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:33.423598   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:33.922973   49088 type.go:168] "Request Body" body=""
	I1202 19:12:33.923049   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:33.923389   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:34.422949   49088 type.go:168] "Request Body" body=""
	I1202 19:12:34.423022   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:34.423336   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:34.922878   49088 type.go:168] "Request Body" body=""
	I1202 19:12:34.922943   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:34.923200   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:12:34.923239   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:12:35.423165   49088 type.go:168] "Request Body" body=""
	I1202 19:12:35.423238   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:35.423568   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:35.923256   49088 type.go:168] "Request Body" body=""
	I1202 19:12:35.923329   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:35.923637   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:36.422898   49088 type.go:168] "Request Body" body=""
	I1202 19:12:36.422965   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:36.423218   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:36.922956   49088 type.go:168] "Request Body" body=""
	I1202 19:12:36.923035   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:36.923355   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:12:36.923409   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:12:37.423086   49088 type.go:168] "Request Body" body=""
	I1202 19:12:37.423161   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:37.423449   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:37.923834   49088 type.go:168] "Request Body" body=""
	I1202 19:12:37.923903   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:37.924159   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:38.422906   49088 type.go:168] "Request Body" body=""
	I1202 19:12:38.422977   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:38.423261   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:38.922970   49088 type.go:168] "Request Body" body=""
	I1202 19:12:38.923046   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:38.923389   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:12:38.923440   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:12:39.423714   49088 type.go:168] "Request Body" body=""
	I1202 19:12:39.423788   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:39.424096   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:39.923879   49088 type.go:168] "Request Body" body=""
	I1202 19:12:39.923950   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:39.924267   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:40.423003   49088 type.go:168] "Request Body" body=""
	I1202 19:12:40.423081   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:40.423385   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:40.923058   49088 type.go:168] "Request Body" body=""
	I1202 19:12:40.923129   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:40.923426   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:12:40.923486   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:12:41.422953   49088 type.go:168] "Request Body" body=""
	I1202 19:12:41.423050   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:41.423343   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:41.923078   49088 type.go:168] "Request Body" body=""
	I1202 19:12:41.923156   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:41.923473   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:42.423139   49088 type.go:168] "Request Body" body=""
	I1202 19:12:42.423204   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:42.423502   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:42.923011   49088 type.go:168] "Request Body" body=""
	I1202 19:12:42.923090   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:42.923406   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:43.423000   49088 type.go:168] "Request Body" body=""
	I1202 19:12:43.423079   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:43.423400   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:12:43.423446   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:12:43.923623   49088 type.go:168] "Request Body" body=""
	I1202 19:12:43.923696   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:43.923958   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:44.423707   49088 type.go:168] "Request Body" body=""
	I1202 19:12:44.423805   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:44.424192   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:44.923041   49088 type.go:168] "Request Body" body=""
	I1202 19:12:44.923114   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:44.923460   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:45.423487   49088 type.go:168] "Request Body" body=""
	I1202 19:12:45.423555   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:45.423816   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:12:45.423856   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:12:45.923602   49088 type.go:168] "Request Body" body=""
	I1202 19:12:45.923678   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:45.924003   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:46.423772   49088 type.go:168] "Request Body" body=""
	I1202 19:12:46.423843   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:46.424193   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:46.923702   49088 type.go:168] "Request Body" body=""
	I1202 19:12:46.923775   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:46.924028   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:47.423742   49088 type.go:168] "Request Body" body=""
	I1202 19:12:47.423811   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:47.424126   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:12:47.424184   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:12:47.923682   49088 type.go:168] "Request Body" body=""
	I1202 19:12:47.923759   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:47.924070   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:48.423650   49088 type.go:168] "Request Body" body=""
	I1202 19:12:48.423719   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:48.423981   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:48.923704   49088 type.go:168] "Request Body" body=""
	I1202 19:12:48.923774   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:48.924090   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:49.423900   49088 type.go:168] "Request Body" body=""
	I1202 19:12:49.423983   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:49.424354   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:12:49.424408   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:12:49.922950   49088 type.go:168] "Request Body" body=""
	I1202 19:12:49.923023   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:49.923365   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:50.423243   49088 type.go:168] "Request Body" body=""
	I1202 19:12:50.423322   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:50.423653   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:50.922964   49088 type.go:168] "Request Body" body=""
	I1202 19:12:50.923042   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:50.923377   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:51.422954   49088 type.go:168] "Request Body" body=""
	I1202 19:12:51.423023   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:51.423346   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:51.923047   49088 type.go:168] "Request Body" body=""
	I1202 19:12:51.923126   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:51.923460   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:12:51.923513   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:12:52.422937   49088 type.go:168] "Request Body" body=""
	I1202 19:12:52.423014   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:52.423303   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:52.922929   49088 type.go:168] "Request Body" body=""
	I1202 19:12:52.923016   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:52.923375   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:53.423008   49088 type.go:168] "Request Body" body=""
	I1202 19:12:53.423100   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:53.423482   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:53.922987   49088 type.go:168] "Request Body" body=""
	I1202 19:12:53.923061   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:53.923413   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:54.422909   49088 type.go:168] "Request Body" body=""
	I1202 19:12:54.422983   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:54.423246   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:12:54.423296   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:12:54.923072   49088 type.go:168] "Request Body" body=""
	I1202 19:12:54.923153   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:54.923523   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:55.423518   49088 type.go:168] "Request Body" body=""
	I1202 19:12:55.423603   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:55.423968   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:55.923602   49088 type.go:168] "Request Body" body=""
	I1202 19:12:55.923672   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:55.924000   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:56.423806   49088 type.go:168] "Request Body" body=""
	I1202 19:12:56.423881   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:56.424245   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:12:56.424298   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:12:56.923893   49088 type.go:168] "Request Body" body=""
	I1202 19:12:56.923967   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:56.924355   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:57.422930   49088 type.go:168] "Request Body" body=""
	I1202 19:12:57.422997   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:57.423287   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:57.922958   49088 type.go:168] "Request Body" body=""
	I1202 19:12:57.923030   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:57.923341   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:58.423065   49088 type.go:168] "Request Body" body=""
	I1202 19:12:58.423137   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:58.423498   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:58.923185   49088 type.go:168] "Request Body" body=""
	I1202 19:12:58.923255   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:58.923518   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:12:58.923557   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:12:59.422988   49088 type.go:168] "Request Body" body=""
	I1202 19:12:59.423062   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:59.423415   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:59.923116   49088 type.go:168] "Request Body" body=""
	I1202 19:12:59.923196   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:59.923537   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:00.422981   49088 type.go:168] "Request Body" body=""
	I1202 19:13:00.423055   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:00.423421   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:00.922957   49088 type.go:168] "Request Body" body=""
	I1202 19:13:00.923031   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:00.923358   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:01.423025   49088 type.go:168] "Request Body" body=""
	I1202 19:13:01.423098   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:01.423429   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:13:01.423485   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:13:01.923135   49088 type.go:168] "Request Body" body=""
	I1202 19:13:01.923210   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:01.923470   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:02.422970   49088 type.go:168] "Request Body" body=""
	I1202 19:13:02.423066   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:02.423386   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:02.923090   49088 type.go:168] "Request Body" body=""
	I1202 19:13:02.923194   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:02.923514   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:03.423202   49088 type.go:168] "Request Body" body=""
	I1202 19:13:03.423271   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:03.423580   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:13:03.423632   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:13:03.922949   49088 type.go:168] "Request Body" body=""
	I1202 19:13:03.923051   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:03.923344   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:04.422975   49088 type.go:168] "Request Body" body=""
	I1202 19:13:04.423049   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:04.423409   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:04.923130   49088 type.go:168] "Request Body" body=""
	I1202 19:13:04.923198   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:04.923485   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:05.423469   49088 type.go:168] "Request Body" body=""
	I1202 19:13:05.423540   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:05.423887   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:13:05.423943   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:13:05.923727   49088 type.go:168] "Request Body" body=""
	I1202 19:13:05.923801   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:05.924115   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:06.423862   49088 type.go:168] "Request Body" body=""
	I1202 19:13:06.423938   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:06.424199   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:06.922912   49088 type.go:168] "Request Body" body=""
	I1202 19:13:06.922984   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:06.923325   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:07.422978   49088 type.go:168] "Request Body" body=""
	I1202 19:13:07.423053   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:07.423373   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:07.922921   49088 type.go:168] "Request Body" body=""
	I1202 19:13:07.922995   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:07.923258   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:13:07.923298   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:13:08.422983   49088 type.go:168] "Request Body" body=""
	I1202 19:13:08.423057   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:08.423391   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:08.922933   49088 type.go:168] "Request Body" body=""
	I1202 19:13:08.923006   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:08.923329   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:09.422864   49088 type.go:168] "Request Body" body=""
	I1202 19:13:09.422931   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:09.423213   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:09.922936   49088 type.go:168] "Request Body" body=""
	I1202 19:13:09.923016   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:09.923330   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:13:09.923387   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:13:10.422968   49088 type.go:168] "Request Body" body=""
	I1202 19:13:10.423047   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:10.423382   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:10.922900   49088 type.go:168] "Request Body" body=""
	I1202 19:13:10.922972   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:10.923227   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:11.422967   49088 type.go:168] "Request Body" body=""
	I1202 19:13:11.423047   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:11.423372   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:11.922966   49088 type.go:168] "Request Body" body=""
	I1202 19:13:11.923038   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:11.923347   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:12.422911   49088 type.go:168] "Request Body" body=""
	I1202 19:13:12.422981   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:12.423291   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:13:12.423344   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:13:12.922990   49088 type.go:168] "Request Body" body=""
	I1202 19:13:12.923081   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:12.923443   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:13.423156   49088 type.go:168] "Request Body" body=""
	I1202 19:13:13.423234   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:13.423549   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:13.922900   49088 type.go:168] "Request Body" body=""
	I1202 19:13:13.922969   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:13.923235   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:14.422957   49088 type.go:168] "Request Body" body=""
	I1202 19:13:14.423029   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:14.423396   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:13:14.423449   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:13:14.923039   49088 type.go:168] "Request Body" body=""
	I1202 19:13:14.923111   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:14.923397   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:15.423223   49088 type.go:168] "Request Body" body=""
	I1202 19:13:15.423302   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:15.423557   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:15.922960   49088 type.go:168] "Request Body" body=""
	I1202 19:13:15.923034   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:15.923367   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:16.423062   49088 type.go:168] "Request Body" body=""
	I1202 19:13:16.423140   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:16.423473   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:13:16.423529   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:13:16.923839   49088 type.go:168] "Request Body" body=""
	I1202 19:13:16.923917   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:16.924188   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:17.422894   49088 type.go:168] "Request Body" body=""
	I1202 19:13:17.422973   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:17.423308   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:17.922909   49088 type.go:168] "Request Body" body=""
	I1202 19:13:17.922985   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:17.923348   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:18.423042   49088 type.go:168] "Request Body" body=""
	I1202 19:13:18.423112   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:18.423378   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:18.923054   49088 type.go:168] "Request Body" body=""
	I1202 19:13:18.923129   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:18.923451   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:13:18.923507   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:13:19.422987   49088 type.go:168] "Request Body" body=""
	I1202 19:13:19.423063   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:19.423405   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:19.923563   49088 type.go:168] "Request Body" body=""
	I1202 19:13:19.923634   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:19.923942   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:20.423766   49088 type.go:168] "Request Body" body=""
	I1202 19:13:20.423836   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:20.424183   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:20.922889   49088 type.go:168] "Request Body" body=""
	I1202 19:13:20.922963   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:20.923286   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:21.422910   49088 type.go:168] "Request Body" body=""
	I1202 19:13:21.422986   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:21.423265   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:13:21.423304   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:13:21.922960   49088 type.go:168] "Request Body" body=""
	I1202 19:13:21.923032   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:21.923372   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:22.422982   49088 type.go:168] "Request Body" body=""
	I1202 19:13:22.423066   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:22.423408   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:22.923660   49088 type.go:168] "Request Body" body=""
	I1202 19:13:22.923726   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:22.923989   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:23.423852   49088 type.go:168] "Request Body" body=""
	I1202 19:13:23.423930   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:23.424355   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:13:23.424410   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:13:23.922972   49088 type.go:168] "Request Body" body=""
	I1202 19:13:23.923088   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:23.923457   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:24.423150   49088 type.go:168] "Request Body" body=""
	I1202 19:13:24.423233   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:24.423566   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:24.923507   49088 type.go:168] "Request Body" body=""
	I1202 19:13:24.923591   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:24.923948   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:25.423717   49088 type.go:168] "Request Body" body=""
	I1202 19:13:25.423797   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:25.424161   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:25.922841   49088 node_ready.go:38] duration metric: took 6m0.000085627s for node "functional-449836" to be "Ready" ...
	I1202 19:13:25.925875   49088 out.go:203] 
	W1202 19:13:25.928738   49088 out.go:285] X Exiting due to GUEST_START: failed to start node: wait 6m0s for node: waiting for node to be ready: WaitNodeCondition: context deadline exceeded
	W1202 19:13:25.928760   49088 out.go:285] * 
	W1202 19:13:25.930899   49088 out.go:308] ╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	I1202 19:13:25.934748   49088 out.go:203] 
	
	
	==> container status <==
	CONTAINER           IMAGE               CREATED             STATE               NAME                ATTEMPT             POD ID              POD                 NAMESPACE
	
	
	==> containerd <==
	Dec 02 19:13:33 functional-449836 containerd[5842]: time="2025-12-02T19:13:33.645009599Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.3\"  labels:{key:\"io.cri-containerd.image\"  value:\"managed\"}"
	Dec 02 19:13:34 functional-449836 containerd[5842]: time="2025-12-02T19:13:34.681117821Z" level=info msg="No images store for sha256:a1f83055284ec302ac691d8677946d8b4e772fb7071d39ada1cc9184cb70814b"
	Dec 02 19:13:34 functional-449836 containerd[5842]: time="2025-12-02T19:13:34.683398192Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:latest\""
	Dec 02 19:13:34 functional-449836 containerd[5842]: time="2025-12-02T19:13:34.693674136Z" level=info msg="ImageCreate event name:\"sha256:8cb2091f603e75187e2f6226c5901d12e00b1d1f778c6471ae4578e8a1c4724a\"  labels:{key:\"io.cri-containerd.image\"  value:\"managed\"}"
	Dec 02 19:13:34 functional-449836 containerd[5842]: time="2025-12-02T19:13:34.694210924Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:latest\"  labels:{key:\"io.cri-containerd.image\"  value:\"managed\"}"
	Dec 02 19:13:35 functional-449836 containerd[5842]: time="2025-12-02T19:13:35.662714060Z" level=info msg="No images store for sha256:2eaa477b07fa94239065ddfa3c63972bc774ee1ebce5861cf639d04e0692e711"
	Dec 02 19:13:35 functional-449836 containerd[5842]: time="2025-12-02T19:13:35.665361216Z" level=info msg="ImageCreate event name:\"docker.io/library/minikube-local-cache-test:functional-449836\""
	Dec 02 19:13:35 functional-449836 containerd[5842]: time="2025-12-02T19:13:35.678350434Z" level=info msg="ImageCreate event name:\"sha256:9d59d8178e3a4c209f1c923737212d2bc54133c85455f4f8e051d069a9d30853\"  labels:{key:\"io.cri-containerd.image\"  value:\"managed\"}"
	Dec 02 19:13:35 functional-449836 containerd[5842]: time="2025-12-02T19:13:35.678839993Z" level=info msg="ImageUpdate event name:\"docker.io/library/minikube-local-cache-test:functional-449836\"  labels:{key:\"io.cri-containerd.image\"  value:\"managed\"}"
	Dec 02 19:13:36 functional-449836 containerd[5842]: time="2025-12-02T19:13:36.465220542Z" level=info msg="RemoveImage \"registry.k8s.io/pause:latest\""
	Dec 02 19:13:36 functional-449836 containerd[5842]: time="2025-12-02T19:13:36.468232761Z" level=info msg="ImageDelete event name:\"registry.k8s.io/pause:latest\""
	Dec 02 19:13:36 functional-449836 containerd[5842]: time="2025-12-02T19:13:36.470595363Z" level=info msg="ImageDelete event name:\"sha256:8cb2091f603e75187e2f6226c5901d12e00b1d1f778c6471ae4578e8a1c4724a\""
	Dec 02 19:13:36 functional-449836 containerd[5842]: time="2025-12-02T19:13:36.483876889Z" level=info msg="RemoveImage \"registry.k8s.io/pause:latest\" returns successfully"
	Dec 02 19:13:37 functional-449836 containerd[5842]: time="2025-12-02T19:13:37.376758474Z" level=info msg="RemoveImage \"registry.k8s.io/pause:3.1\""
	Dec 02 19:13:37 functional-449836 containerd[5842]: time="2025-12-02T19:13:37.379358582Z" level=info msg="ImageDelete event name:\"registry.k8s.io/pause:3.1\""
	Dec 02 19:13:37 functional-449836 containerd[5842]: time="2025-12-02T19:13:37.382369874Z" level=info msg="ImageDelete event name:\"sha256:8057e0500773a37cde2cff041eb13ebd68c748419a2fbfd1dfb5bf38696cc8e5\""
	Dec 02 19:13:37 functional-449836 containerd[5842]: time="2025-12-02T19:13:37.389733563Z" level=info msg="RemoveImage \"registry.k8s.io/pause:3.1\" returns successfully"
	Dec 02 19:13:37 functional-449836 containerd[5842]: time="2025-12-02T19:13:37.509877470Z" level=info msg="No images store for sha256:3ac89611d5efd8eb74174b1f04c33b7e73b651cec35b5498caf0cfdd2efd7d48"
	Dec 02 19:13:37 functional-449836 containerd[5842]: time="2025-12-02T19:13:37.512059256Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.1\""
	Dec 02 19:13:37 functional-449836 containerd[5842]: time="2025-12-02T19:13:37.519058867Z" level=info msg="ImageCreate event name:\"sha256:8057e0500773a37cde2cff041eb13ebd68c748419a2fbfd1dfb5bf38696cc8e5\"  labels:{key:\"io.cri-containerd.image\"  value:\"managed\"}"
	Dec 02 19:13:37 functional-449836 containerd[5842]: time="2025-12-02T19:13:37.519608110Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.1\"  labels:{key:\"io.cri-containerd.image\"  value:\"managed\"}"
	Dec 02 19:13:37 functional-449836 containerd[5842]: time="2025-12-02T19:13:37.678879163Z" level=info msg="No images store for sha256:a1f83055284ec302ac691d8677946d8b4e772fb7071d39ada1cc9184cb70814b"
	Dec 02 19:13:37 functional-449836 containerd[5842]: time="2025-12-02T19:13:37.681070623Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:latest\""
	Dec 02 19:13:37 functional-449836 containerd[5842]: time="2025-12-02T19:13:37.692574195Z" level=info msg="ImageCreate event name:\"sha256:8cb2091f603e75187e2f6226c5901d12e00b1d1f778c6471ae4578e8a1c4724a\"  labels:{key:\"io.cri-containerd.image\"  value:\"managed\"}"
	Dec 02 19:13:37 functional-449836 containerd[5842]: time="2025-12-02T19:13:37.692993650Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:latest\"  labels:{key:\"io.cri-containerd.image\"  value:\"managed\"}"
	
	
	==> describe nodes <==
	command /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" failed with error: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:13:39.448219    9783 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:13:39.448996    9783 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:13:39.450639    9783 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:13:39.450955    9783 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:13:39.452542    9783 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	
	==> dmesg <==
	[Dec 2 18:17] ACPI: SRAT not present
	[  +0.000000] ACPI: SRAT not present
	[  +0.000000] SPI driver altr_a10sr has no spi_device_id for altr,a10sr
	[  +0.015127] device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
	[  +0.494583] systemd[1]: Configuration file /run/systemd/system/netplan-ovs-cleanup.service is marked world-inaccessible. This has no effect as configuration data is accessible via APIs without restrictions. Proceeding anyway.
	[  +0.035754] systemd[1]: /lib/systemd/system/snapd.service:23: Unknown key name 'RestartMode' in section 'Service', ignoring.
	[  +0.870945] ena 0000:00:05.0: LLQ is not supported Fallback to host mode policy.
	[  +6.299680] kauditd_printk_skb: 36 callbacks suppressed
	
	
	==> kernel <==
	 19:13:39 up 55 min,  0 user,  load average: 0.50, 0.36, 0.52
	Linux functional-449836 5.15.0-1084-aws #91~20.04.1-Ubuntu SMP Fri May 2 07:00:04 UTC 2025 aarch64 GNU/Linux
	PRETTY_NAME="Debian GNU/Linux 12 (bookworm)"
	
	
	==> kubelet <==
	Dec 02 19:13:36 functional-449836 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 02 19:13:37 functional-449836 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 825.
	Dec 02 19:13:37 functional-449836 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 02 19:13:37 functional-449836 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 02 19:13:37 functional-449836 kubelet[9576]: E1202 19:13:37.225510    9576 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 02 19:13:37 functional-449836 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 02 19:13:37 functional-449836 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 02 19:13:37 functional-449836 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 826.
	Dec 02 19:13:37 functional-449836 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 02 19:13:37 functional-449836 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 02 19:13:37 functional-449836 kubelet[9669]: E1202 19:13:37.975764    9669 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 02 19:13:37 functional-449836 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 02 19:13:37 functional-449836 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 02 19:13:38 functional-449836 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 827.
	Dec 02 19:13:38 functional-449836 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 02 19:13:38 functional-449836 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 02 19:13:38 functional-449836 kubelet[9698]: E1202 19:13:38.735624    9698 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 02 19:13:38 functional-449836 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 02 19:13:38 functional-449836 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 02 19:13:39 functional-449836 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 828.
	Dec 02 19:13:39 functional-449836 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 02 19:13:39 functional-449836 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 02 19:13:39 functional-449836 kubelet[9788]: E1202 19:13:39.491201    9788 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 02 19:13:39 functional-449836 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 02 19:13:39 functional-449836 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	

-- /stdout --
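The kubelet journal above shows a tight crash loop: restart counters 825 through 828 within roughly three seconds, each attempt dying in configuration validation. A small sketch (a hypothetical helper, not part of the test harness) for pulling the restart counters out of such a journal excerpt:

```shell
# Extract systemd restart counters from a captured kubelet journal excerpt.
# The two sample lines below are stand-ins for the real `minikube logs` output.
journal='Dec 02 19:13:37 functional-449836 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 825.
Dec 02 19:13:38 functional-449836 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 827.'
printf '%s\n' "$journal" | sed -n 's/.*restart counter is at \([0-9]*\)\./\1/p'
# prints 825 and 827, one per line
```

Diffing consecutive counters against their timestamps gives the restart rate, which here is bounded only by systemd's default restart delay.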
helpers_test.go:262: (dbg) Run:  out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-449836 -n functional-449836
helpers_test.go:262: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-449836 -n functional-449836: exit status 2 (374.699884ms)

-- stdout --
	Stopped

-- /stdout --
helpers_test.go:262: status error: exit status 2 (may be ok)
helpers_test.go:264: "functional-449836" apiserver is not running, skipping kubectl commands (state="Stopped")
--- FAIL: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/MinikubeKubectlCmd (2.27s)
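Every kubelet start attempt above fails validation with "kubelet is configured to not run on a host using cgroup v1", so the failure is environmental rather than specific to this test. A hypothetical host-side check (not output captured from this run) for which cgroup hierarchy the kernel exposes:

```shell
# cgroup v2 mounts a unified hierarchy exposing cgroup.controllers at the
# root; if that file is absent, the host is still on the legacy v1 hierarchy,
# which this kubelet build refuses to start on.
if [ -f /sys/fs/cgroup/cgroup.controllers ]; then
  echo "cgroup v2 (unified)"
else
  echo "cgroup v1 (legacy)"
fi
```

Since the kic container shares the host kernel, the same check can be run either on the Jenkins agent or via `docker exec functional-449836 ...`.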

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/MinikubeKubectlCmdDirectly (2.33s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/MinikubeKubectlCmdDirectly
functional_test.go:756: (dbg) Run:  out/kubectl --context functional-449836 get pods
functional_test.go:756: (dbg) Non-zero exit: out/kubectl --context functional-449836 get pods: exit status 1 (101.908886ms)

** stderr ** 
	The connection to the server 192.168.49.2:8441 was refused - did you specify the right host or port?

** /stderr **
functional_test.go:759: failed to run kubectl directly. args "out/kubectl --context functional-449836 get pods": exit status 1
helpers_test.go:222: -----------------------post-mortem--------------------------------
helpers_test.go:223: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/MinikubeKubectlCmdDirectly]: network settings <======
helpers_test.go:230: HOST ENV snapshots: PROXY env: HTTP_PROXY="<empty>" HTTPS_PROXY="<empty>" NO_PROXY="<empty>"
helpers_test.go:238: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/MinikubeKubectlCmdDirectly]: docker inspect <======
helpers_test.go:239: (dbg) Run:  docker inspect functional-449836
helpers_test.go:243: (dbg) docker inspect functional-449836:

-- stdout --
	[
	    {
	        "Id": "6870f21b62bb6903aca3129f1ce4723cf3f2ffad99a50b164f8e2dc04b50e75d",
	        "Created": "2025-12-02T18:58:53.515075222Z",
	        "Path": "/usr/local/bin/entrypoint",
	        "Args": [
	            "/sbin/init"
	        ],
	        "State": {
	            "Status": "running",
	            "Running": true,
	            "Paused": false,
	            "Restarting": false,
	            "OOMKilled": false,
	            "Dead": false,
	            "Pid": 43093,
	            "ExitCode": 0,
	            "Error": "",
	            "StartedAt": "2025-12-02T18:58:53.587847975Z",
	            "FinishedAt": "0001-01-01T00:00:00Z"
	        },
	        "Image": "sha256:ac919894123858c63a6b115b7a0677e38aafc32ba4f00c3ebbd7c61e958451be",
	        "ResolvConfPath": "/var/lib/docker/containers/6870f21b62bb6903aca3129f1ce4723cf3f2ffad99a50b164f8e2dc04b50e75d/resolv.conf",
	        "HostnamePath": "/var/lib/docker/containers/6870f21b62bb6903aca3129f1ce4723cf3f2ffad99a50b164f8e2dc04b50e75d/hostname",
	        "HostsPath": "/var/lib/docker/containers/6870f21b62bb6903aca3129f1ce4723cf3f2ffad99a50b164f8e2dc04b50e75d/hosts",
	        "LogPath": "/var/lib/docker/containers/6870f21b62bb6903aca3129f1ce4723cf3f2ffad99a50b164f8e2dc04b50e75d/6870f21b62bb6903aca3129f1ce4723cf3f2ffad99a50b164f8e2dc04b50e75d-json.log",
	        "Name": "/functional-449836",
	        "RestartCount": 0,
	        "Driver": "overlay2",
	        "Platform": "linux",
	        "MountLabel": "",
	        "ProcessLabel": "",
	        "AppArmorProfile": "unconfined",
	        "ExecIDs": null,
	        "HostConfig": {
	            "Binds": [
	                "/lib/modules:/lib/modules:ro",
	                "functional-449836:/var"
	            ],
	            "ContainerIDFile": "",
	            "LogConfig": {
	                "Type": "json-file",
	                "Config": {}
	            },
	            "NetworkMode": "functional-449836",
	            "PortBindings": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "8441/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ]
	            },
	            "RestartPolicy": {
	                "Name": "no",
	                "MaximumRetryCount": 0
	            },
	            "AutoRemove": false,
	            "VolumeDriver": "",
	            "VolumesFrom": null,
	            "ConsoleSize": [
	                0,
	                0
	            ],
	            "CapAdd": null,
	            "CapDrop": null,
	            "CgroupnsMode": "host",
	            "Dns": [],
	            "DnsOptions": [],
	            "DnsSearch": [],
	            "ExtraHosts": null,
	            "GroupAdd": null,
	            "IpcMode": "private",
	            "Cgroup": "",
	            "Links": null,
	            "OomScoreAdj": 0,
	            "PidMode": "",
	            "Privileged": true,
	            "PublishAllPorts": false,
	            "ReadonlyRootfs": false,
	            "SecurityOpt": [
	                "seccomp=unconfined",
	                "apparmor=unconfined",
	                "label=disable"
	            ],
	            "Tmpfs": {
	                "/run": "",
	                "/tmp": ""
	            },
	            "UTSMode": "",
	            "UsernsMode": "",
	            "ShmSize": 67108864,
	            "Runtime": "runc",
	            "Isolation": "",
	            "CpuShares": 0,
	            "Memory": 4294967296,
	            "NanoCpus": 2000000000,
	            "CgroupParent": "",
	            "BlkioWeight": 0,
	            "BlkioWeightDevice": [],
	            "BlkioDeviceReadBps": [],
	            "BlkioDeviceWriteBps": [],
	            "BlkioDeviceReadIOps": [],
	            "BlkioDeviceWriteIOps": [],
	            "CpuPeriod": 0,
	            "CpuQuota": 0,
	            "CpuRealtimePeriod": 0,
	            "CpuRealtimeRuntime": 0,
	            "CpusetCpus": "",
	            "CpusetMems": "",
	            "Devices": [],
	            "DeviceCgroupRules": null,
	            "DeviceRequests": null,
	            "MemoryReservation": 0,
	            "MemorySwap": 8589934592,
	            "MemorySwappiness": null,
	            "OomKillDisable": false,
	            "PidsLimit": null,
	            "Ulimits": [],
	            "CpuCount": 0,
	            "CpuPercent": 0,
	            "IOMaximumIOps": 0,
	            "IOMaximumBandwidth": 0,
	            "MaskedPaths": null,
	            "ReadonlyPaths": null
	        },
	        "GraphDriver": {
	            "Data": {
	                "ID": "6870f21b62bb6903aca3129f1ce4723cf3f2ffad99a50b164f8e2dc04b50e75d",
	                "LowerDir": "/var/lib/docker/overlay2/41ea5a18fbf8709879a7fc4066a6a4a1474aa86e898ee1ccabe5669a1871131d-init/diff:/var/lib/docker/overlay2/a59c61675ee48e07a7f4a8725bd393449453344ad8907963779ea1c0059d936c/diff",
	                "MergedDir": "/var/lib/docker/overlay2/41ea5a18fbf8709879a7fc4066a6a4a1474aa86e898ee1ccabe5669a1871131d/merged",
	                "UpperDir": "/var/lib/docker/overlay2/41ea5a18fbf8709879a7fc4066a6a4a1474aa86e898ee1ccabe5669a1871131d/diff",
	                "WorkDir": "/var/lib/docker/overlay2/41ea5a18fbf8709879a7fc4066a6a4a1474aa86e898ee1ccabe5669a1871131d/work"
	            },
	            "Name": "overlay2"
	        },
	        "Mounts": [
	            {
	                "Type": "bind",
	                "Source": "/lib/modules",
	                "Destination": "/lib/modules",
	                "Mode": "ro",
	                "RW": false,
	                "Propagation": "rprivate"
	            },
	            {
	                "Type": "volume",
	                "Name": "functional-449836",
	                "Source": "/var/lib/docker/volumes/functional-449836/_data",
	                "Destination": "/var",
	                "Driver": "local",
	                "Mode": "z",
	                "RW": true,
	                "Propagation": ""
	            }
	        ],
	        "Config": {
	            "Hostname": "functional-449836",
	            "Domainname": "",
	            "User": "",
	            "AttachStdin": false,
	            "AttachStdout": false,
	            "AttachStderr": false,
	            "ExposedPorts": {
	                "22/tcp": {},
	                "2376/tcp": {},
	                "32443/tcp": {},
	                "5000/tcp": {},
	                "8441/tcp": {}
	            },
	            "Tty": true,
	            "OpenStdin": false,
	            "StdinOnce": false,
	            "Env": [
	                "container=docker",
	                "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin"
	            ],
	            "Cmd": null,
	            "Image": "gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b",
	            "Volumes": null,
	            "WorkingDir": "/",
	            "Entrypoint": [
	                "/usr/local/bin/entrypoint",
	                "/sbin/init"
	            ],
	            "OnBuild": null,
	            "Labels": {
	                "created_by.minikube.sigs.k8s.io": "true",
	                "mode.minikube.sigs.k8s.io": "functional-449836",
	                "name.minikube.sigs.k8s.io": "functional-449836",
	                "role.minikube.sigs.k8s.io": ""
	            },
	            "StopSignal": "SIGRTMIN+3"
	        },
	        "NetworkSettings": {
	            "Bridge": "",
	            "SandboxID": "410fbaab809a56b556195f7ed6eeff8dcd31e9020fb1dbfacf74828b79df3d88",
	            "SandboxKey": "/var/run/docker/netns/410fbaab809a",
	            "Ports": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "32788"
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "32789"
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "32792"
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "32790"
	                    }
	                ],
	                "8441/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "32791"
	                    }
	                ]
	            },
	            "HairpinMode": false,
	            "LinkLocalIPv6Address": "",
	            "LinkLocalIPv6PrefixLen": 0,
	            "SecondaryIPAddresses": null,
	            "SecondaryIPv6Addresses": null,
	            "EndpointID": "",
	            "Gateway": "",
	            "GlobalIPv6Address": "",
	            "GlobalIPv6PrefixLen": 0,
	            "IPAddress": "",
	            "IPPrefixLen": 0,
	            "IPv6Gateway": "",
	            "MacAddress": "",
	            "Networks": {
	                "functional-449836": {
	                    "IPAMConfig": {
	                        "IPv4Address": "192.168.49.2"
	                    },
	                    "Links": null,
	                    "Aliases": null,
	                    "MacAddress": "82:18:11:ce:46:48",
	                    "DriverOpts": null,
	                    "GwPriority": 0,
	                    "NetworkID": "3cb7d67d4fa267ddd6b37211325b224fb3fb811be8ff57bda18e19f6929ec9c8",
	                    "EndpointID": "20c8c1a67e53d7615656777f73986a40cb1c6affb22c4db185c479ac85cbdb14",
	                    "Gateway": "192.168.49.1",
	                    "IPAddress": "192.168.49.2",
	                    "IPPrefixLen": 24,
	                    "IPv6Gateway": "",
	                    "GlobalIPv6Address": "",
	                    "GlobalIPv6PrefixLen": 0,
	                    "DNSNames": [
	                        "functional-449836",
	                        "6870f21b62bb"
	                    ]
	                }
	            }
	        }
	    }
	]

-- /stdout --
helpers_test.go:247: (dbg) Run:  out/minikube-linux-arm64 status --format={{.Host}} -p functional-449836 -n functional-449836
helpers_test.go:247: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.Host}} -p functional-449836 -n functional-449836: exit status 2 (304.617085ms)

-- stdout --
	Running

-- /stdout --
helpers_test.go:247: status error: exit status 2 (may be ok)
helpers_test.go:252: <<< TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/MinikubeKubectlCmdDirectly FAILED: start of post-mortem logs <<<
helpers_test.go:253: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/MinikubeKubectlCmdDirectly]: minikube logs <======
helpers_test.go:255: (dbg) Run:  out/minikube-linux-arm64 -p functional-449836 logs -n 25
helpers_test.go:260: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/MinikubeKubectlCmdDirectly logs: 
-- stdout --
	
	==> Audit <==
	┌─────────┬─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┬───────────────────┬─────────┬─────────┬─────────────────────┬─────────────────────┐
	│ COMMAND │                                                                          ARGS                                                                           │      PROFILE      │  USER   │ VERSION │     START TIME      │      END TIME       │
	├─────────┼─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┼───────────────────┼─────────┼─────────┼─────────────────────┼─────────────────────┤
	│ image   │ functional-224594 image ls --format short --alsologtostderr                                                                                             │ functional-224594 │ jenkins │ v1.37.0 │ 02 Dec 25 18:58 UTC │ 02 Dec 25 18:58 UTC │
	│ image   │ functional-224594 image ls --format yaml --alsologtostderr                                                                                              │ functional-224594 │ jenkins │ v1.37.0 │ 02 Dec 25 18:58 UTC │ 02 Dec 25 18:58 UTC │
	│ ssh     │ functional-224594 ssh pgrep buildkitd                                                                                                                   │ functional-224594 │ jenkins │ v1.37.0 │ 02 Dec 25 18:58 UTC │                     │
	│ image   │ functional-224594 image ls --format json --alsologtostderr                                                                                              │ functional-224594 │ jenkins │ v1.37.0 │ 02 Dec 25 18:58 UTC │ 02 Dec 25 18:58 UTC │
	│ image   │ functional-224594 image ls --format table --alsologtostderr                                                                                             │ functional-224594 │ jenkins │ v1.37.0 │ 02 Dec 25 18:58 UTC │ 02 Dec 25 18:58 UTC │
	│ image   │ functional-224594 image build -t localhost/my-image:functional-224594 testdata/build --alsologtostderr                                                  │ functional-224594 │ jenkins │ v1.37.0 │ 02 Dec 25 18:58 UTC │ 02 Dec 25 18:58 UTC │
	│ image   │ functional-224594 image ls                                                                                                                              │ functional-224594 │ jenkins │ v1.37.0 │ 02 Dec 25 18:58 UTC │ 02 Dec 25 18:58 UTC │
	│ delete  │ -p functional-224594                                                                                                                                    │ functional-224594 │ jenkins │ v1.37.0 │ 02 Dec 25 18:58 UTC │ 02 Dec 25 18:58 UTC │
	│ start   │ -p functional-449836 --memory=4096 --apiserver-port=8441 --wait=all --driver=docker  --container-runtime=containerd --kubernetes-version=v1.35.0-beta.0 │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 18:58 UTC │                     │
	│ start   │ -p functional-449836 --alsologtostderr -v=8                                                                                                             │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:07 UTC │                     │
	│ cache   │ functional-449836 cache add registry.k8s.io/pause:3.1                                                                                                   │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:13 UTC │ 02 Dec 25 19:13 UTC │
	│ cache   │ functional-449836 cache add registry.k8s.io/pause:3.3                                                                                                   │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:13 UTC │ 02 Dec 25 19:13 UTC │
	│ cache   │ functional-449836 cache add registry.k8s.io/pause:latest                                                                                                │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:13 UTC │ 02 Dec 25 19:13 UTC │
	│ cache   │ functional-449836 cache add minikube-local-cache-test:functional-449836                                                                                 │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:13 UTC │ 02 Dec 25 19:13 UTC │
	│ cache   │ functional-449836 cache delete minikube-local-cache-test:functional-449836                                                                              │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:13 UTC │ 02 Dec 25 19:13 UTC │
	│ cache   │ delete registry.k8s.io/pause:3.3                                                                                                                        │ minikube          │ jenkins │ v1.37.0 │ 02 Dec 25 19:13 UTC │ 02 Dec 25 19:13 UTC │
	│ cache   │ list                                                                                                                                                    │ minikube          │ jenkins │ v1.37.0 │ 02 Dec 25 19:13 UTC │ 02 Dec 25 19:13 UTC │
	│ ssh     │ functional-449836 ssh sudo crictl images                                                                                                                │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:13 UTC │ 02 Dec 25 19:13 UTC │
	│ ssh     │ functional-449836 ssh sudo crictl rmi registry.k8s.io/pause:latest                                                                                      │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:13 UTC │ 02 Dec 25 19:13 UTC │
	│ ssh     │ functional-449836 ssh sudo crictl inspecti registry.k8s.io/pause:latest                                                                                 │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:13 UTC │                     │
	│ cache   │ functional-449836 cache reload                                                                                                                          │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:13 UTC │ 02 Dec 25 19:13 UTC │
	│ ssh     │ functional-449836 ssh sudo crictl inspecti registry.k8s.io/pause:latest                                                                                 │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:13 UTC │ 02 Dec 25 19:13 UTC │
	│ cache   │ delete registry.k8s.io/pause:3.1                                                                                                                        │ minikube          │ jenkins │ v1.37.0 │ 02 Dec 25 19:13 UTC │ 02 Dec 25 19:13 UTC │
	│ cache   │ delete registry.k8s.io/pause:latest                                                                                                                     │ minikube          │ jenkins │ v1.37.0 │ 02 Dec 25 19:13 UTC │ 02 Dec 25 19:13 UTC │
	│ kubectl │ functional-449836 kubectl -- --context functional-449836 get pods                                                                                       │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:13 UTC │                     │
	└─────────┴─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┴───────────────────┴─────────┴─────────┴─────────────────────┴─────────────────────┘
	
	
	==> Last Start <==
	Log file created at: 2025/12/02 19:07:19
	Running on machine: ip-172-31-31-251
	Binary: Built with gc go1.25.3 for linux/arm64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I1202 19:07:19.929855   49088 out.go:360] Setting OutFile to fd 1 ...
	I1202 19:07:19.930082   49088 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1202 19:07:19.930109   49088 out.go:374] Setting ErrFile to fd 2...
	I1202 19:07:19.930127   49088 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1202 19:07:19.930424   49088 root.go:338] Updating PATH: /home/jenkins/minikube-integration/22021-2487/.minikube/bin
	I1202 19:07:19.930829   49088 out.go:368] Setting JSON to false
	I1202 19:07:19.931678   49088 start.go:133] hostinfo: {"hostname":"ip-172-31-31-251","uptime":2976,"bootTime":1764699464,"procs":155,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1084-aws","kernelArch":"aarch64","virtualizationSystem":"","virtualizationRole":"","hostId":"982e3628-3742-4b3e-bb63-ac1b07660ec7"}
	I1202 19:07:19.931776   49088 start.go:143] virtualization:  
	I1202 19:07:19.935245   49088 out.go:179] * [functional-449836] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	I1202 19:07:19.939094   49088 out.go:179]   - MINIKUBE_LOCATION=22021
	I1202 19:07:19.939188   49088 notify.go:221] Checking for updates...
	I1202 19:07:19.944799   49088 out.go:179]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I1202 19:07:19.947646   49088 out.go:179]   - KUBECONFIG=/home/jenkins/minikube-integration/22021-2487/kubeconfig
	I1202 19:07:19.950501   49088 out.go:179]   - MINIKUBE_HOME=/home/jenkins/minikube-integration/22021-2487/.minikube
	I1202 19:07:19.953361   49088 out.go:179]   - MINIKUBE_BIN=out/minikube-linux-arm64
	I1202 19:07:19.956281   49088 out.go:179]   - MINIKUBE_FORCE_SYSTEMD=
	I1202 19:07:19.959695   49088 config.go:182] Loaded profile config "functional-449836": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-beta.0
	I1202 19:07:19.959887   49088 driver.go:422] Setting default libvirt URI to qemu:///system
	I1202 19:07:19.996438   49088 docker.go:124] docker version: linux-28.1.1:Docker Engine - Community
	I1202 19:07:19.996577   49088 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1202 19:07:20.063124   49088 info.go:266] docker info: {ID:EOU5:DNGX:XN6V:L2FZ:UXRM:5TWK:EVUR:KC2F:GT7Z:Y4O4:GB77:5PD3 Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:52 SystemTime:2025-12-02 19:07:20.053388152 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214839296 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-31-251 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1202 19:07:20.063232   49088 docker.go:319] overlay module found
	I1202 19:07:20.066390   49088 out.go:179] * Using the docker driver based on existing profile
	I1202 19:07:20.069271   49088 start.go:309] selected driver: docker
	I1202 19:07:20.069311   49088 start.go:927] validating driver "docker" against &{Name:functional-449836 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-449836 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1202 19:07:20.069422   49088 start.go:938] status for docker: {Installed:true Healthy:true Running:false NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I1202 19:07:20.069541   49088 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1202 19:07:20.132012   49088 info.go:266] docker info: {ID:EOU5:DNGX:XN6V:L2FZ:UXRM:5TWK:EVUR:KC2F:GT7Z:Y4O4:GB77:5PD3 Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:52 SystemTime:2025-12-02 19:07:20.122627931 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214839296 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-31-251 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1202 19:07:20.132615   49088 cni.go:84] Creating CNI manager for ""
	I1202 19:07:20.132692   49088 cni.go:143] "docker" driver + "containerd" runtime found, recommending kindnet
	I1202 19:07:20.132751   49088 start.go:353] cluster config:
	{Name:functional-449836 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-449836 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local C
ontainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPa
th: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1202 19:07:20.135845   49088 out.go:179] * Starting "functional-449836" primary control-plane node in "functional-449836" cluster
	I1202 19:07:20.138639   49088 cache.go:134] Beginning downloading kic base image for docker with containerd
	I1202 19:07:20.141498   49088 out.go:179] * Pulling base image v0.0.48-1764169655-21974 ...
	I1202 19:07:20.144479   49088 image.go:81] Checking for gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b in local docker daemon
	I1202 19:07:20.144604   49088 preload.go:188] Checking if preload exists for k8s version v1.35.0-beta.0 and runtime containerd
	I1202 19:07:20.163347   49088 image.go:100] Found gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b in local docker daemon, skipping pull
	I1202 19:07:20.163372   49088 cache.go:158] gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b exists in daemon, skipping load
	W1202 19:07:20.218193   49088 preload.go:144] https://storage.googleapis.com/minikube-preloaded-volume-tarballs/v18/v1.35.0-beta.0/preloaded-images-k8s-v18-v1.35.0-beta.0-containerd-overlay2-arm64.tar.lz4 status code: 404
	W1202 19:07:20.422833   49088 preload.go:144] https://github.com/kubernetes-sigs/minikube-preloads/releases/download/v18/preloaded-images-k8s-v18-v1.35.0-beta.0-containerd-overlay2-arm64.tar.lz4 status code: 404
	I1202 19:07:20.423042   49088 profile.go:143] Saving config to /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/config.json ...
	I1202 19:07:20.423128   49088 cache.go:107] acquiring lock: {Name:mkb3ffc95e4b7ac3756206049d851bf516a8abb7 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1202 19:07:20.423219   49088 cache.go:115] /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/gcr.io/k8s-minikube/storage-provisioner_v5 exists
	I1202 19:07:20.423234   49088 cache.go:96] cache image "gcr.io/k8s-minikube/storage-provisioner:v5" -> "/home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/gcr.io/k8s-minikube/storage-provisioner_v5" took 122.125µs
	I1202 19:07:20.423249   49088 cache.go:80] save to tar file gcr.io/k8s-minikube/storage-provisioner:v5 -> /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/gcr.io/k8s-minikube/storage-provisioner_v5 succeeded
	I1202 19:07:20.423267   49088 cache.go:107] acquiring lock: {Name:mkfa39bba55c97fa80e441f8dcbaf6dc6a2ab6fc Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1202 19:07:20.423303   49088 cache.go:115] /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/kube-apiserver_v1.35.0-beta.0 exists
	I1202 19:07:20.423312   49088 cache.go:96] cache image "registry.k8s.io/kube-apiserver:v1.35.0-beta.0" -> "/home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/kube-apiserver_v1.35.0-beta.0" took 47.557µs
	I1202 19:07:20.423318   49088 cache.go:80] save to tar file registry.k8s.io/kube-apiserver:v1.35.0-beta.0 -> /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/kube-apiserver_v1.35.0-beta.0 succeeded
	I1202 19:07:20.423331   49088 cache.go:107] acquiring lock: {Name:mk7e3720bc30e96a70479f1acc707ef52791d566 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1202 19:07:20.423365   49088 cache.go:115] /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/kube-controller-manager_v1.35.0-beta.0 exists
	I1202 19:07:20.423374   49088 cache.go:96] cache image "registry.k8s.io/kube-controller-manager:v1.35.0-beta.0" -> "/home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/kube-controller-manager_v1.35.0-beta.0" took 44.415µs
	I1202 19:07:20.423380   49088 cache.go:80] save to tar file registry.k8s.io/kube-controller-manager:v1.35.0-beta.0 -> /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/kube-controller-manager_v1.35.0-beta.0 succeeded
	I1202 19:07:20.423395   49088 cache.go:107] acquiring lock: {Name:mk87fcb81abcb9216a37cb770c1db1797c0a7f91 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1202 19:07:20.423422   49088 cache.go:115] /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/kube-scheduler_v1.35.0-beta.0 exists
	I1202 19:07:20.423432   49088 cache.go:96] cache image "registry.k8s.io/kube-scheduler:v1.35.0-beta.0" -> "/home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/kube-scheduler_v1.35.0-beta.0" took 37.579µs
	I1202 19:07:20.423438   49088 cache.go:80] save to tar file registry.k8s.io/kube-scheduler:v1.35.0-beta.0 -> /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/kube-scheduler_v1.35.0-beta.0 succeeded
	I1202 19:07:20.423447   49088 cache.go:107] acquiring lock: {Name:mkb0da8840651a370490ea2b46213e13fc0d5dac Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1202 19:07:20.423476   49088 cache.go:115] /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/kube-proxy_v1.35.0-beta.0 exists
	I1202 19:07:20.423484   49088 cache.go:96] cache image "registry.k8s.io/kube-proxy:v1.35.0-beta.0" -> "/home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/kube-proxy_v1.35.0-beta.0" took 38.933µs
	I1202 19:07:20.423490   49088 cache.go:80] save to tar file registry.k8s.io/kube-proxy:v1.35.0-beta.0 -> /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/kube-proxy_v1.35.0-beta.0 succeeded
	I1202 19:07:20.423510   49088 cache.go:107] acquiring lock: {Name:mk9eec99a3e8e54b076a2ce506d08ceb8a7f49cb Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1202 19:07:20.423540   49088 cache.go:115] /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/pause_3.10.1 exists
	I1202 19:07:20.423549   49088 cache.go:96] cache image "registry.k8s.io/pause:3.10.1" -> "/home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/pause_3.10.1" took 40.796µs
	I1202 19:07:20.423555   49088 cache.go:80] save to tar file registry.k8s.io/pause:3.10.1 -> /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/pause_3.10.1 succeeded
	I1202 19:07:20.423569   49088 cache.go:243] Successfully downloaded all kic artifacts
	I1202 19:07:20.423588   49088 cache.go:107] acquiring lock: {Name:mkbf2c8ea9fae755e8e7ae1c483527f313757bae Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1202 19:07:20.423620   49088 cache.go:115] /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/coredns/coredns_v1.13.1 exists
	I1202 19:07:20.423629   49088 cache.go:96] cache image "registry.k8s.io/coredns/coredns:v1.13.1" -> "/home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/coredns/coredns_v1.13.1" took 42.487µs
	I1202 19:07:20.423635   49088 cache.go:80] save to tar file registry.k8s.io/coredns/coredns:v1.13.1 -> /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/coredns/coredns_v1.13.1 succeeded
	I1202 19:07:20.423646   49088 start.go:360] acquireMachinesLock for functional-449836: {Name:mk8999fdfa518fc15358d07431fe9bec286a035e Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1202 19:07:20.423706   49088 start.go:364] duration metric: took 31.868µs to acquireMachinesLock for "functional-449836"
	I1202 19:07:20.423570   49088 cache.go:107] acquiring lock: {Name:mk280b51a6d3bfe0cb60ae7355309f1bf1f99e1d Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1202 19:07:20.423753   49088 start.go:96] Skipping create...Using existing machine configuration
	I1202 19:07:20.423783   49088 fix.go:54] fixHost starting: 
	I1202 19:07:20.423759   49088 cache.go:115] /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/etcd_3.6.5-0 exists
	I1202 19:07:20.423888   49088 cache.go:96] cache image "registry.k8s.io/etcd:3.6.5-0" -> "/home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/etcd_3.6.5-0" took 323.2µs
	I1202 19:07:20.423896   49088 cache.go:80] save to tar file registry.k8s.io/etcd:3.6.5-0 -> /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/etcd_3.6.5-0 succeeded
	I1202 19:07:20.423906   49088 cache.go:87] Successfully saved all images to host disk.
	I1202 19:07:20.424111   49088 cli_runner.go:164] Run: docker container inspect functional-449836 --format={{.State.Status}}
	I1202 19:07:20.441213   49088 fix.go:112] recreateIfNeeded on functional-449836: state=Running err=<nil>
	W1202 19:07:20.441244   49088 fix.go:138] unexpected machine state, will restart: <nil>
	I1202 19:07:20.444707   49088 out.go:252] * Updating the running docker "functional-449836" container ...
	I1202 19:07:20.444749   49088 machine.go:94] provisionDockerMachine start ...
	I1202 19:07:20.444842   49088 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-449836
	I1202 19:07:20.461943   49088 main.go:143] libmachine: Using SSH client type: native
	I1202 19:07:20.462269   49088 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3dad70] 0x3dd270 <nil>  [] 0s} 127.0.0.1 32788 <nil> <nil>}
	I1202 19:07:20.462284   49088 main.go:143] libmachine: About to run SSH command:
	hostname
	I1202 19:07:20.612055   49088 main.go:143] libmachine: SSH cmd err, output: <nil>: functional-449836
	
	I1202 19:07:20.612125   49088 ubuntu.go:182] provisioning hostname "functional-449836"
	I1202 19:07:20.612222   49088 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-449836
	I1202 19:07:20.629856   49088 main.go:143] libmachine: Using SSH client type: native
	I1202 19:07:20.630166   49088 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3dad70] 0x3dd270 <nil>  [] 0s} 127.0.0.1 32788 <nil> <nil>}
	I1202 19:07:20.630180   49088 main.go:143] libmachine: About to run SSH command:
	sudo hostname functional-449836 && echo "functional-449836" | sudo tee /etc/hostname
	I1202 19:07:20.793419   49088 main.go:143] libmachine: SSH cmd err, output: <nil>: functional-449836
	
	I1202 19:07:20.793536   49088 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-449836
	I1202 19:07:20.812441   49088 main.go:143] libmachine: Using SSH client type: native
	I1202 19:07:20.812754   49088 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3dad70] 0x3dd270 <nil>  [] 0s} 127.0.0.1 32788 <nil> <nil>}
	I1202 19:07:20.812775   49088 main.go:143] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sfunctional-449836' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 functional-449836/g' /etc/hosts;
				else 
					echo '127.0.1.1 functional-449836' | sudo tee -a /etc/hosts; 
				fi
			fi
	I1202 19:07:20.961443   49088 main.go:143] libmachine: SSH cmd err, output: <nil>: 
	I1202 19:07:20.961480   49088 ubuntu.go:188] set auth options {CertDir:/home/jenkins/minikube-integration/22021-2487/.minikube CaCertPath:/home/jenkins/minikube-integration/22021-2487/.minikube/certs/ca.pem CaPrivateKeyPath:/home/jenkins/minikube-integration/22021-2487/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/home/jenkins/minikube-integration/22021-2487/.minikube/machines/server.pem ServerKeyPath:/home/jenkins/minikube-integration/22021-2487/.minikube/machines/server-key.pem ClientKeyPath:/home/jenkins/minikube-integration/22021-2487/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/home/jenkins/minikube-integration/22021-2487/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/home/jenkins/minikube-integration/22021-2487/.minikube}
	I1202 19:07:20.961539   49088 ubuntu.go:190] setting up certificates
	I1202 19:07:20.961556   49088 provision.go:84] configureAuth start
	I1202 19:07:20.961634   49088 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" functional-449836
	I1202 19:07:20.990731   49088 provision.go:143] copyHostCerts
	I1202 19:07:20.990790   49088 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22021-2487/.minikube/certs/ca.pem -> /home/jenkins/minikube-integration/22021-2487/.minikube/ca.pem
	I1202 19:07:20.990838   49088 exec_runner.go:144] found /home/jenkins/minikube-integration/22021-2487/.minikube/ca.pem, removing ...
	I1202 19:07:20.990856   49088 exec_runner.go:203] rm: /home/jenkins/minikube-integration/22021-2487/.minikube/ca.pem
	I1202 19:07:20.990938   49088 exec_runner.go:151] cp: /home/jenkins/minikube-integration/22021-2487/.minikube/certs/ca.pem --> /home/jenkins/minikube-integration/22021-2487/.minikube/ca.pem (1082 bytes)
	I1202 19:07:20.991037   49088 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22021-2487/.minikube/certs/cert.pem -> /home/jenkins/minikube-integration/22021-2487/.minikube/cert.pem
	I1202 19:07:20.991060   49088 exec_runner.go:144] found /home/jenkins/minikube-integration/22021-2487/.minikube/cert.pem, removing ...
	I1202 19:07:20.991069   49088 exec_runner.go:203] rm: /home/jenkins/minikube-integration/22021-2487/.minikube/cert.pem
	I1202 19:07:20.991098   49088 exec_runner.go:151] cp: /home/jenkins/minikube-integration/22021-2487/.minikube/certs/cert.pem --> /home/jenkins/minikube-integration/22021-2487/.minikube/cert.pem (1123 bytes)
	I1202 19:07:20.991189   49088 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22021-2487/.minikube/certs/key.pem -> /home/jenkins/minikube-integration/22021-2487/.minikube/key.pem
	I1202 19:07:20.991211   49088 exec_runner.go:144] found /home/jenkins/minikube-integration/22021-2487/.minikube/key.pem, removing ...
	I1202 19:07:20.991220   49088 exec_runner.go:203] rm: /home/jenkins/minikube-integration/22021-2487/.minikube/key.pem
	I1202 19:07:20.991247   49088 exec_runner.go:151] cp: /home/jenkins/minikube-integration/22021-2487/.minikube/certs/key.pem --> /home/jenkins/minikube-integration/22021-2487/.minikube/key.pem (1675 bytes)
	I1202 19:07:20.991297   49088 provision.go:117] generating server cert: /home/jenkins/minikube-integration/22021-2487/.minikube/machines/server.pem ca-key=/home/jenkins/minikube-integration/22021-2487/.minikube/certs/ca.pem private-key=/home/jenkins/minikube-integration/22021-2487/.minikube/certs/ca-key.pem org=jenkins.functional-449836 san=[127.0.0.1 192.168.49.2 functional-449836 localhost minikube]
	I1202 19:07:21.335552   49088 provision.go:177] copyRemoteCerts
	I1202 19:07:21.335618   49088 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I1202 19:07:21.335658   49088 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-449836
	I1202 19:07:21.354079   49088 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:32788 SSHKeyPath:/home/jenkins/minikube-integration/22021-2487/.minikube/machines/functional-449836/id_rsa Username:docker}
	I1202 19:07:21.460475   49088 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22021-2487/.minikube/machines/server.pem -> /etc/docker/server.pem
	I1202 19:07:21.460535   49088 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/machines/server.pem --> /etc/docker/server.pem (1220 bytes)
	I1202 19:07:21.478965   49088 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22021-2487/.minikube/machines/server-key.pem -> /etc/docker/server-key.pem
	I1202 19:07:21.479028   49088 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1675 bytes)
	I1202 19:07:21.497363   49088 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22021-2487/.minikube/certs/ca.pem -> /etc/docker/ca.pem
	I1202 19:07:21.497471   49088 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1082 bytes)
	I1202 19:07:21.514946   49088 provision.go:87] duration metric: took 553.36724ms to configureAuth
	I1202 19:07:21.515020   49088 ubuntu.go:206] setting minikube options for container-runtime
	I1202 19:07:21.515215   49088 config.go:182] Loaded profile config "functional-449836": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-beta.0
	I1202 19:07:21.515248   49088 machine.go:97] duration metric: took 1.070490831s to provisionDockerMachine
	I1202 19:07:21.515264   49088 start.go:293] postStartSetup for "functional-449836" (driver="docker")
	I1202 19:07:21.515276   49088 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I1202 19:07:21.515329   49088 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I1202 19:07:21.515382   49088 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-449836
	I1202 19:07:21.532644   49088 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:32788 SSHKeyPath:/home/jenkins/minikube-integration/22021-2487/.minikube/machines/functional-449836/id_rsa Username:docker}
	I1202 19:07:21.636416   49088 ssh_runner.go:195] Run: cat /etc/os-release
	I1202 19:07:21.639685   49088 command_runner.go:130] > PRETTY_NAME="Debian GNU/Linux 12 (bookworm)"
	I1202 19:07:21.639756   49088 command_runner.go:130] > NAME="Debian GNU/Linux"
	I1202 19:07:21.639777   49088 command_runner.go:130] > VERSION_ID="12"
	I1202 19:07:21.639798   49088 command_runner.go:130] > VERSION="12 (bookworm)"
	I1202 19:07:21.639827   49088 command_runner.go:130] > VERSION_CODENAME=bookworm
	I1202 19:07:21.639832   49088 command_runner.go:130] > ID=debian
	I1202 19:07:21.639847   49088 command_runner.go:130] > HOME_URL="https://www.debian.org/"
	I1202 19:07:21.639859   49088 command_runner.go:130] > SUPPORT_URL="https://www.debian.org/support"
	I1202 19:07:21.639866   49088 command_runner.go:130] > BUG_REPORT_URL="https://bugs.debian.org/"
	I1202 19:07:21.639943   49088 main.go:143] libmachine: Couldn't set key VERSION_CODENAME, no corresponding struct field found
	I1202 19:07:21.639962   49088 info.go:137] Remote host: Debian GNU/Linux 12 (bookworm)
	I1202 19:07:21.639974   49088 filesync.go:126] Scanning /home/jenkins/minikube-integration/22021-2487/.minikube/addons for local assets ...
	I1202 19:07:21.640036   49088 filesync.go:126] Scanning /home/jenkins/minikube-integration/22021-2487/.minikube/files for local assets ...
	I1202 19:07:21.640112   49088 filesync.go:149] local asset: /home/jenkins/minikube-integration/22021-2487/.minikube/files/etc/ssl/certs/44352.pem -> 44352.pem in /etc/ssl/certs
	I1202 19:07:21.640123   49088 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22021-2487/.minikube/files/etc/ssl/certs/44352.pem -> /etc/ssl/certs/44352.pem
	I1202 19:07:21.640204   49088 filesync.go:149] local asset: /home/jenkins/minikube-integration/22021-2487/.minikube/files/etc/test/nested/copy/4435/hosts -> hosts in /etc/test/nested/copy/4435
	I1202 19:07:21.640213   49088 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22021-2487/.minikube/files/etc/test/nested/copy/4435/hosts -> /etc/test/nested/copy/4435/hosts
	I1202 19:07:21.640263   49088 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs /etc/test/nested/copy/4435
	I1202 19:07:21.647807   49088 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/files/etc/ssl/certs/44352.pem --> /etc/ssl/certs/44352.pem (1708 bytes)
	I1202 19:07:21.664872   49088 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/files/etc/test/nested/copy/4435/hosts --> /etc/test/nested/copy/4435/hosts (40 bytes)
	I1202 19:07:21.686465   49088 start.go:296] duration metric: took 171.184702ms for postStartSetup
	I1202 19:07:21.686545   49088 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I1202 19:07:21.686646   49088 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-449836
	I1202 19:07:21.708068   49088 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:32788 SSHKeyPath:/home/jenkins/minikube-integration/22021-2487/.minikube/machines/functional-449836/id_rsa Username:docker}
	I1202 19:07:21.808826   49088 command_runner.go:130] > 18%
	I1202 19:07:21.809461   49088 ssh_runner.go:195] Run: sh -c "df -BG /var | awk 'NR==2{print $4}'"
	I1202 19:07:21.814183   49088 command_runner.go:130] > 159G
	I1202 19:07:21.814719   49088 fix.go:56] duration metric: took 1.390932828s for fixHost
	I1202 19:07:21.814741   49088 start.go:83] releasing machines lock for "functional-449836", held for 1.391011327s
	I1202 19:07:21.814809   49088 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" functional-449836
	I1202 19:07:21.831833   49088 ssh_runner.go:195] Run: cat /version.json
	I1202 19:07:21.831895   49088 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-449836
	I1202 19:07:21.832169   49088 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I1202 19:07:21.832229   49088 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-449836
	I1202 19:07:21.852617   49088 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:32788 SSHKeyPath:/home/jenkins/minikube-integration/22021-2487/.minikube/machines/functional-449836/id_rsa Username:docker}
	I1202 19:07:21.855772   49088 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:32788 SSHKeyPath:/home/jenkins/minikube-integration/22021-2487/.minikube/machines/functional-449836/id_rsa Username:docker}
	I1202 19:07:21.955939   49088 command_runner.go:130] > {"iso_version": "v1.37.0-1763503576-21924", "kicbase_version": "v0.0.48-1764169655-21974", "minikube_version": "v1.37.0", "commit": "5499406178e21d60d74d327c9716de794e8a4797"}
	I1202 19:07:21.956090   49088 ssh_runner.go:195] Run: systemctl --version
	I1202 19:07:22.048548   49088 command_runner.go:130] > <a href="https://github.com/kubernetes/registry.k8s.io">Temporary Redirect</a>.
	I1202 19:07:22.051368   49088 command_runner.go:130] > systemd 252 (252.39-1~deb12u1)
	I1202 19:07:22.051402   49088 command_runner.go:130] > +PAM +AUDIT +SELINUX +APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT +QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified
	I1202 19:07:22.051488   49088 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	I1202 19:07:22.055900   49088 command_runner.go:130] ! stat: cannot statx '/etc/cni/net.d/*loopback.conf*': No such file or directory
	W1202 19:07:22.056072   49088 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I1202 19:07:22.056144   49088 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%p, " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I1202 19:07:22.064483   49088 cni.go:259] no active bridge cni configs found in "/etc/cni/net.d" - nothing to disable
	I1202 19:07:22.064507   49088 start.go:496] detecting cgroup driver to use...
	I1202 19:07:22.064540   49088 detect.go:187] detected "cgroupfs" cgroup driver on host os
	I1202 19:07:22.064608   49088 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I1202 19:07:22.080944   49088 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I1202 19:07:22.094328   49088 docker.go:218] disabling cri-docker service (if available) ...
	I1202 19:07:22.094412   49088 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.socket
	I1202 19:07:22.110538   49088 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.service
	I1202 19:07:22.123916   49088 ssh_runner.go:195] Run: sudo systemctl disable cri-docker.socket
	I1202 19:07:22.251555   49088 ssh_runner.go:195] Run: sudo systemctl mask cri-docker.service
	I1202 19:07:22.372403   49088 docker.go:234] disabling docker service ...
	I1202 19:07:22.372547   49088 ssh_runner.go:195] Run: sudo systemctl stop -f docker.socket
	I1202 19:07:22.390362   49088 ssh_runner.go:195] Run: sudo systemctl stop -f docker.service
	I1202 19:07:22.404129   49088 ssh_runner.go:195] Run: sudo systemctl disable docker.socket
	I1202 19:07:22.527674   49088 ssh_runner.go:195] Run: sudo systemctl mask docker.service
	I1202 19:07:22.641245   49088 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service docker
	I1202 19:07:22.654510   49088 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I1202 19:07:22.669149   49088 command_runner.go:130] > runtime-endpoint: unix:///run/containerd/containerd.sock
	I1202 19:07:22.670616   49088 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.10.1"|' /etc/containerd/config.toml"
	I1202 19:07:22.680782   49088 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I1202 19:07:22.690619   49088 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I1202 19:07:22.690690   49088 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I1202 19:07:22.700650   49088 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I1202 19:07:22.710637   49088 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I1202 19:07:22.720237   49088 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I1202 19:07:22.730375   49088 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I1202 19:07:22.738458   49088 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I1202 19:07:22.747256   49088 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I1202 19:07:22.756269   49088 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
	I1202 19:07:22.765824   49088 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I1202 19:07:22.772632   49088 command_runner.go:130] > net.bridge.bridge-nf-call-iptables = 1
	I1202 19:07:22.773683   49088 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I1202 19:07:22.781384   49088 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1202 19:07:22.894036   49088 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I1202 19:07:22.996092   49088 start.go:543] Will wait 60s for socket path /run/containerd/containerd.sock
	I1202 19:07:22.996190   49088 ssh_runner.go:195] Run: stat /run/containerd/containerd.sock
	I1202 19:07:23.000049   49088 command_runner.go:130] >   File: /run/containerd/containerd.sock
	I1202 19:07:23.000075   49088 command_runner.go:130] >   Size: 0         	Blocks: 0          IO Block: 4096   socket
	I1202 19:07:23.000083   49088 command_runner.go:130] > Device: 0,72	Inode: 1611        Links: 1
	I1202 19:07:23.000090   49088 command_runner.go:130] > Access: (0660/srw-rw----)  Uid: (    0/    root)   Gid: (    0/    root)
	I1202 19:07:23.000119   49088 command_runner.go:130] > Access: 2025-12-02 19:07:22.966112143 +0000
	I1202 19:07:23.000134   49088 command_runner.go:130] > Modify: 2025-12-02 19:07:22.966112143 +0000
	I1202 19:07:23.000139   49088 command_runner.go:130] > Change: 2025-12-02 19:07:22.966112143 +0000
	I1202 19:07:23.000143   49088 command_runner.go:130] >  Birth: -
	I1202 19:07:23.000708   49088 start.go:564] Will wait 60s for crictl version
	I1202 19:07:23.000798   49088 ssh_runner.go:195] Run: which crictl
	I1202 19:07:23.004553   49088 command_runner.go:130] > /usr/local/bin/crictl
	I1202 19:07:23.004698   49088 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl version
	I1202 19:07:23.031006   49088 command_runner.go:130] > Version:  0.1.0
	I1202 19:07:23.031142   49088 command_runner.go:130] > RuntimeName:  containerd
	I1202 19:07:23.031156   49088 command_runner.go:130] > RuntimeVersion:  v2.1.5
	I1202 19:07:23.031165   49088 command_runner.go:130] > RuntimeApiVersion:  v1
	I1202 19:07:23.033497   49088 start.go:580] Version:  0.1.0
	RuntimeName:  containerd
	RuntimeVersion:  v2.1.5
	RuntimeApiVersion:  v1
	I1202 19:07:23.033588   49088 ssh_runner.go:195] Run: containerd --version
	I1202 19:07:23.053512   49088 command_runner.go:130] > containerd containerd.io v2.1.5 fcd43222d6b07379a4be9786bda52438f0dd16a1
	I1202 19:07:23.055064   49088 ssh_runner.go:195] Run: containerd --version
	I1202 19:07:23.073280   49088 command_runner.go:130] > containerd containerd.io v2.1.5 fcd43222d6b07379a4be9786bda52438f0dd16a1
	I1202 19:07:23.080684   49088 out.go:179] * Preparing Kubernetes v1.35.0-beta.0 on containerd 2.1.5 ...
	I1202 19:07:23.083736   49088 cli_runner.go:164] Run: docker network inspect functional-449836 --format "{"Name": "{{.Name}}","Driver": "{{.Driver}}","Subnet": "{{range .IPAM.Config}}{{.Subnet}}{{end}}","Gateway": "{{range .IPAM.Config}}{{.Gateway}}{{end}}","MTU": {{if (index .Options "com.docker.network.driver.mtu")}}{{(index .Options "com.docker.network.driver.mtu")}}{{else}}0{{end}}, "ContainerIPs": [{{range $k,$v := .Containers }}"{{$v.IPv4Address}}",{{end}}]}"
	I1202 19:07:23.100485   49088 ssh_runner.go:195] Run: grep 192.168.49.1	host.minikube.internal$ /etc/hosts
	I1202 19:07:23.104603   49088 command_runner.go:130] > 192.168.49.1	host.minikube.internal
	I1202 19:07:23.104709   49088 kubeadm.go:884] updating cluster {Name:functional-449836 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-449836 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} ...
	I1202 19:07:23.104831   49088 preload.go:188] Checking if preload exists for k8s version v1.35.0-beta.0 and runtime containerd
	I1202 19:07:23.104890   49088 ssh_runner.go:195] Run: sudo crictl images --output json
	I1202 19:07:23.127690   49088 command_runner.go:130] > {
	I1202 19:07:23.127710   49088 command_runner.go:130] >   "images":  [
	I1202 19:07:23.127715   49088 command_runner.go:130] >     {
	I1202 19:07:23.127725   49088 command_runner.go:130] >       "id":  "sha256:66749159455b3f08c8318fe0233122f54d0f5889f9c5fdfb73c3fd9d99895b51",
	I1202 19:07:23.127729   49088 command_runner.go:130] >       "repoTags":  [
	I1202 19:07:23.127744   49088 command_runner.go:130] >         "gcr.io/k8s-minikube/storage-provisioner:v5"
	I1202 19:07:23.127750   49088 command_runner.go:130] >       ],
	I1202 19:07:23.127755   49088 command_runner.go:130] >       "repoDigests":  [],
	I1202 19:07:23.127759   49088 command_runner.go:130] >       "size":  "8032639",
	I1202 19:07:23.127765   49088 command_runner.go:130] >       "username":  "",
	I1202 19:07:23.127776   49088 command_runner.go:130] >       "pinned":  false
	I1202 19:07:23.127781   49088 command_runner.go:130] >     },
	I1202 19:07:23.127784   49088 command_runner.go:130] >     {
	I1202 19:07:23.127792   49088 command_runner.go:130] >       "id":  "sha256:e08f4d9d2e6ede8185064c13b41f8eeee95b609c0ca93b6fe7509fe527c907cf",
	I1202 19:07:23.127800   49088 command_runner.go:130] >       "repoTags":  [
	I1202 19:07:23.127806   49088 command_runner.go:130] >         "registry.k8s.io/coredns/coredns:v1.13.1"
	I1202 19:07:23.127813   49088 command_runner.go:130] >       ],
	I1202 19:07:23.127817   49088 command_runner.go:130] >       "repoDigests":  [],
	I1202 19:07:23.127822   49088 command_runner.go:130] >       "size":  "21166088",
	I1202 19:07:23.127826   49088 command_runner.go:130] >       "username":  "nonroot",
	I1202 19:07:23.127832   49088 command_runner.go:130] >       "pinned":  false
	I1202 19:07:23.127835   49088 command_runner.go:130] >     },
	I1202 19:07:23.127838   49088 command_runner.go:130] >     {
	I1202 19:07:23.127845   49088 command_runner.go:130] >       "id":  "sha256:2c5f0dedd21c25ec3a6709934d22152d53ec50fe57b72d29e4450655e3d14d42",
	I1202 19:07:23.127855   49088 command_runner.go:130] >       "repoTags":  [
	I1202 19:07:23.127869   49088 command_runner.go:130] >         "registry.k8s.io/etcd:3.6.5-0"
	I1202 19:07:23.127876   49088 command_runner.go:130] >       ],
	I1202 19:07:23.127880   49088 command_runner.go:130] >       "repoDigests":  [],
	I1202 19:07:23.127887   49088 command_runner.go:130] >       "size":  "21134420",
	I1202 19:07:23.127892   49088 command_runner.go:130] >       "uid":  {
	I1202 19:07:23.127899   49088 command_runner.go:130] >         "value":  "0"
	I1202 19:07:23.127903   49088 command_runner.go:130] >       },
	I1202 19:07:23.127907   49088 command_runner.go:130] >       "username":  "",
	I1202 19:07:23.127911   49088 command_runner.go:130] >       "pinned":  false
	I1202 19:07:23.127917   49088 command_runner.go:130] >     },
	I1202 19:07:23.127919   49088 command_runner.go:130] >     {
	I1202 19:07:23.127926   49088 command_runner.go:130] >       "id":  "sha256:ccd634d9bcc36ac6235e9c86676cd3a02c06afc3788a25f1bbf39ca7d44585f4",
	I1202 19:07:23.127930   49088 command_runner.go:130] >       "repoTags":  [
	I1202 19:07:23.127938   49088 command_runner.go:130] >         "registry.k8s.io/kube-apiserver:v1.35.0-beta.0"
	I1202 19:07:23.127945   49088 command_runner.go:130] >       ],
	I1202 19:07:23.127949   49088 command_runner.go:130] >       "repoDigests":  [],
	I1202 19:07:23.127953   49088 command_runner.go:130] >       "size":  "24676285",
	I1202 19:07:23.127961   49088 command_runner.go:130] >       "uid":  {
	I1202 19:07:23.127965   49088 command_runner.go:130] >         "value":  "0"
	I1202 19:07:23.127971   49088 command_runner.go:130] >       },
	I1202 19:07:23.127975   49088 command_runner.go:130] >       "username":  "",
	I1202 19:07:23.127983   49088 command_runner.go:130] >       "pinned":  false
	I1202 19:07:23.127987   49088 command_runner.go:130] >     },
	I1202 19:07:23.127996   49088 command_runner.go:130] >     {
	I1202 19:07:23.128002   49088 command_runner.go:130] >       "id":  "sha256:68b5f775f18769fcb77bd8474c80bda2050163b6c66f4551f352b7381b8ca5be",
	I1202 19:07:23.128006   49088 command_runner.go:130] >       "repoTags":  [
	I1202 19:07:23.128012   49088 command_runner.go:130] >         "registry.k8s.io/kube-controller-manager:v1.35.0-beta.0"
	I1202 19:07:23.128015   49088 command_runner.go:130] >       ],
	I1202 19:07:23.128019   49088 command_runner.go:130] >       "repoDigests":  [],
	I1202 19:07:23.128026   49088 command_runner.go:130] >       "size":  "20658969",
	I1202 19:07:23.128029   49088 command_runner.go:130] >       "uid":  {
	I1202 19:07:23.128033   49088 command_runner.go:130] >         "value":  "0"
	I1202 19:07:23.128041   49088 command_runner.go:130] >       },
	I1202 19:07:23.128052   49088 command_runner.go:130] >       "username":  "",
	I1202 19:07:23.128059   49088 command_runner.go:130] >       "pinned":  false
	I1202 19:07:23.128063   49088 command_runner.go:130] >     },
	I1202 19:07:23.128070   49088 command_runner.go:130] >     {
	I1202 19:07:23.128077   49088 command_runner.go:130] >       "id":  "sha256:404c2e12861777b763b8feaa316d36680fc68ad308a8d2f6e55f1bb981cdd904",
	I1202 19:07:23.128081   49088 command_runner.go:130] >       "repoTags":  [
	I1202 19:07:23.128088   49088 command_runner.go:130] >         "registry.k8s.io/kube-proxy:v1.35.0-beta.0"
	I1202 19:07:23.128092   49088 command_runner.go:130] >       ],
	I1202 19:07:23.128096   49088 command_runner.go:130] >       "repoDigests":  [],
	I1202 19:07:23.128099   49088 command_runner.go:130] >       "size":  "22428165",
	I1202 19:07:23.128103   49088 command_runner.go:130] >       "username":  "",
	I1202 19:07:23.128109   49088 command_runner.go:130] >       "pinned":  false
	I1202 19:07:23.128113   49088 command_runner.go:130] >     },
	I1202 19:07:23.128116   49088 command_runner.go:130] >     {
	I1202 19:07:23.128123   49088 command_runner.go:130] >       "id":  "sha256:16378741539f1be9c6e347d127537d379a6592587b09b4eb47964cb5c43a409b",
	I1202 19:07:23.128130   49088 command_runner.go:130] >       "repoTags":  [
	I1202 19:07:23.128135   49088 command_runner.go:130] >         "registry.k8s.io/kube-scheduler:v1.35.0-beta.0"
	I1202 19:07:23.128143   49088 command_runner.go:130] >       ],
	I1202 19:07:23.128152   49088 command_runner.go:130] >       "repoDigests":  [],
	I1202 19:07:23.128160   49088 command_runner.go:130] >       "size":  "15389290",
	I1202 19:07:23.128163   49088 command_runner.go:130] >       "uid":  {
	I1202 19:07:23.128167   49088 command_runner.go:130] >         "value":  "0"
	I1202 19:07:23.128170   49088 command_runner.go:130] >       },
	I1202 19:07:23.128175   49088 command_runner.go:130] >       "username":  "",
	I1202 19:07:23.128179   49088 command_runner.go:130] >       "pinned":  false
	I1202 19:07:23.128185   49088 command_runner.go:130] >     },
	I1202 19:07:23.128188   49088 command_runner.go:130] >     {
	I1202 19:07:23.128199   49088 command_runner.go:130] >       "id":  "sha256:d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd",
	I1202 19:07:23.128203   49088 command_runner.go:130] >       "repoTags":  [
	I1202 19:07:23.128212   49088 command_runner.go:130] >         "registry.k8s.io/pause:3.10.1"
	I1202 19:07:23.128215   49088 command_runner.go:130] >       ],
	I1202 19:07:23.128223   49088 command_runner.go:130] >       "repoDigests":  [],
	I1202 19:07:23.128227   49088 command_runner.go:130] >       "size":  "265458",
	I1202 19:07:23.128238   49088 command_runner.go:130] >       "uid":  {
	I1202 19:07:23.128243   49088 command_runner.go:130] >         "value":  "65535"
	I1202 19:07:23.128248   49088 command_runner.go:130] >       },
	I1202 19:07:23.128252   49088 command_runner.go:130] >       "username":  "",
	I1202 19:07:23.128256   49088 command_runner.go:130] >       "pinned":  true
	I1202 19:07:23.128259   49088 command_runner.go:130] >     }
	I1202 19:07:23.128262   49088 command_runner.go:130] >   ]
	I1202 19:07:23.128265   49088 command_runner.go:130] > }
	I1202 19:07:23.130379   49088 containerd.go:627] all images are preloaded for containerd runtime.
	I1202 19:07:23.130403   49088 cache_images.go:86] Images are preloaded, skipping loading
	I1202 19:07:23.130410   49088 kubeadm.go:935] updating node { 192.168.49.2 8441 v1.35.0-beta.0 containerd true true} ...
	I1202 19:07:23.130509   49088 kubeadm.go:947] kubelet [Unit]
	Wants=containerd.service
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.35.0-beta.0/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=functional-449836 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.168.49.2
	
	[Install]
	 config:
	{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-449836 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I1202 19:07:23.130576   49088 ssh_runner.go:195] Run: sudo crictl info
	I1202 19:07:23.152707   49088 command_runner.go:130] > {
	I1202 19:07:23.152731   49088 command_runner.go:130] >   "cniconfig": {
	I1202 19:07:23.152737   49088 command_runner.go:130] >     "Networks": [
	I1202 19:07:23.152741   49088 command_runner.go:130] >       {
	I1202 19:07:23.152746   49088 command_runner.go:130] >         "Config": {
	I1202 19:07:23.152752   49088 command_runner.go:130] >           "CNIVersion": "0.3.1",
	I1202 19:07:23.152758   49088 command_runner.go:130] >           "Name": "cni-loopback",
	I1202 19:07:23.152768   49088 command_runner.go:130] >           "Plugins": [
	I1202 19:07:23.152775   49088 command_runner.go:130] >             {
	I1202 19:07:23.152779   49088 command_runner.go:130] >               "Network": {
	I1202 19:07:23.152784   49088 command_runner.go:130] >                 "ipam": {},
	I1202 19:07:23.152789   49088 command_runner.go:130] >                 "type": "loopback"
	I1202 19:07:23.152798   49088 command_runner.go:130] >               },
	I1202 19:07:23.152803   49088 command_runner.go:130] >               "Source": "{\"type\":\"loopback\"}"
	I1202 19:07:23.152810   49088 command_runner.go:130] >             }
	I1202 19:07:23.152814   49088 command_runner.go:130] >           ],
	I1202 19:07:23.152828   49088 command_runner.go:130] >           "Source": "{\n\"cniVersion\": \"0.3.1\",\n\"name\": \"cni-loopback\",\n\"plugins\": [{\n  \"type\": \"loopback\"\n}]\n}"
	I1202 19:07:23.152835   49088 command_runner.go:130] >         },
	I1202 19:07:23.152840   49088 command_runner.go:130] >         "IFName": "lo"
	I1202 19:07:23.152847   49088 command_runner.go:130] >       }
	I1202 19:07:23.152850   49088 command_runner.go:130] >     ],
	I1202 19:07:23.152855   49088 command_runner.go:130] >     "PluginConfDir": "/etc/cni/net.d",
	I1202 19:07:23.152860   49088 command_runner.go:130] >     "PluginDirs": [
	I1202 19:07:23.152865   49088 command_runner.go:130] >       "/opt/cni/bin"
	I1202 19:07:23.152869   49088 command_runner.go:130] >     ],
	I1202 19:07:23.152873   49088 command_runner.go:130] >     "PluginMaxConfNum": 1,
	I1202 19:07:23.152879   49088 command_runner.go:130] >     "Prefix": "eth"
	I1202 19:07:23.152883   49088 command_runner.go:130] >   },
	I1202 19:07:23.152891   49088 command_runner.go:130] >   "config": {
	I1202 19:07:23.152894   49088 command_runner.go:130] >     "cdiSpecDirs": [
	I1202 19:07:23.152898   49088 command_runner.go:130] >       "/etc/cdi",
	I1202 19:07:23.152907   49088 command_runner.go:130] >       "/var/run/cdi"
	I1202 19:07:23.152910   49088 command_runner.go:130] >     ],
	I1202 19:07:23.152917   49088 command_runner.go:130] >     "cni": {
	I1202 19:07:23.152921   49088 command_runner.go:130] >       "binDir": "",
	I1202 19:07:23.152928   49088 command_runner.go:130] >       "binDirs": [
	I1202 19:07:23.152933   49088 command_runner.go:130] >         "/opt/cni/bin"
	I1202 19:07:23.152936   49088 command_runner.go:130] >       ],
	I1202 19:07:23.152941   49088 command_runner.go:130] >       "confDir": "/etc/cni/net.d",
	I1202 19:07:23.152947   49088 command_runner.go:130] >       "confTemplate": "",
	I1202 19:07:23.152954   49088 command_runner.go:130] >       "ipPref": "",
	I1202 19:07:23.152958   49088 command_runner.go:130] >       "maxConfNum": 1,
	I1202 19:07:23.152963   49088 command_runner.go:130] >       "setupSerially": false,
	I1202 19:07:23.152969   49088 command_runner.go:130] >       "useInternalLoopback": false
	I1202 19:07:23.152977   49088 command_runner.go:130] >     },
	I1202 19:07:23.152983   49088 command_runner.go:130] >     "containerd": {
	I1202 19:07:23.152992   49088 command_runner.go:130] >       "defaultRuntimeName": "runc",
	I1202 19:07:23.152997   49088 command_runner.go:130] >       "ignoreBlockIONotEnabledErrors": false,
	I1202 19:07:23.153006   49088 command_runner.go:130] >       "ignoreRdtNotEnabledErrors": false,
	I1202 19:07:23.153010   49088 command_runner.go:130] >       "runtimes": {
	I1202 19:07:23.153017   49088 command_runner.go:130] >         "runc": {
	I1202 19:07:23.153022   49088 command_runner.go:130] >           "ContainerAnnotations": null,
	I1202 19:07:23.153026   49088 command_runner.go:130] >           "PodAnnotations": null,
	I1202 19:07:23.153031   49088 command_runner.go:130] >           "baseRuntimeSpec": "",
	I1202 19:07:23.153035   49088 command_runner.go:130] >           "cgroupWritable": false,
	I1202 19:07:23.153041   49088 command_runner.go:130] >           "cniConfDir": "",
	I1202 19:07:23.153046   49088 command_runner.go:130] >           "cniMaxConfNum": 0,
	I1202 19:07:23.153053   49088 command_runner.go:130] >           "io_type": "",
	I1202 19:07:23.153058   49088 command_runner.go:130] >           "options": {
	I1202 19:07:23.153066   49088 command_runner.go:130] >             "BinaryName": "",
	I1202 19:07:23.153071   49088 command_runner.go:130] >             "CriuImagePath": "",
	I1202 19:07:23.153079   49088 command_runner.go:130] >             "CriuWorkPath": "",
	I1202 19:07:23.153083   49088 command_runner.go:130] >             "IoGid": 0,
	I1202 19:07:23.153091   49088 command_runner.go:130] >             "IoUid": 0,
	I1202 19:07:23.153096   49088 command_runner.go:130] >             "NoNewKeyring": false,
	I1202 19:07:23.153100   49088 command_runner.go:130] >             "Root": "",
	I1202 19:07:23.153104   49088 command_runner.go:130] >             "ShimCgroup": "",
	I1202 19:07:23.153111   49088 command_runner.go:130] >             "SystemdCgroup": false
	I1202 19:07:23.153115   49088 command_runner.go:130] >           },
	I1202 19:07:23.153120   49088 command_runner.go:130] >           "privileged_without_host_devices": false,
	I1202 19:07:23.153128   49088 command_runner.go:130] >           "privileged_without_host_devices_all_devices_allowed": false,
	I1202 19:07:23.153136   49088 command_runner.go:130] >           "runtimePath": "",
	I1202 19:07:23.153143   49088 command_runner.go:130] >           "runtimeType": "io.containerd.runc.v2",
	I1202 19:07:23.153237   49088 command_runner.go:130] >           "sandboxer": "podsandbox",
	I1202 19:07:23.153375   49088 command_runner.go:130] >           "snapshotter": ""
	I1202 19:07:23.153385   49088 command_runner.go:130] >         }
	I1202 19:07:23.153389   49088 command_runner.go:130] >       }
	I1202 19:07:23.153393   49088 command_runner.go:130] >     },
	I1202 19:07:23.153414   49088 command_runner.go:130] >     "containerdEndpoint": "/run/containerd/containerd.sock",
	I1202 19:07:23.153424   49088 command_runner.go:130] >     "containerdRootDir": "/var/lib/containerd",
	I1202 19:07:23.153435   49088 command_runner.go:130] >     "device_ownership_from_security_context": false,
	I1202 19:07:23.153444   49088 command_runner.go:130] >     "disableApparmor": false,
	I1202 19:07:23.153449   49088 command_runner.go:130] >     "disableHugetlbController": true,
	I1202 19:07:23.153457   49088 command_runner.go:130] >     "disableProcMount": false,
	I1202 19:07:23.153467   49088 command_runner.go:130] >     "drainExecSyncIOTimeout": "0s",
	I1202 19:07:23.153475   49088 command_runner.go:130] >     "enableCDI": true,
	I1202 19:07:23.153479   49088 command_runner.go:130] >     "enableSelinux": false,
	I1202 19:07:23.153484   49088 command_runner.go:130] >     "enableUnprivilegedICMP": true,
	I1202 19:07:23.153490   49088 command_runner.go:130] >     "enableUnprivilegedPorts": true,
	I1202 19:07:23.153500   49088 command_runner.go:130] >     "ignoreDeprecationWarnings": null,
	I1202 19:07:23.153508   49088 command_runner.go:130] >     "ignoreImageDefinedVolumes": false,
	I1202 19:07:23.153516   49088 command_runner.go:130] >     "maxContainerLogLineSize": 16384,
	I1202 19:07:23.153522   49088 command_runner.go:130] >     "netnsMountsUnderStateDir": false,
	I1202 19:07:23.153534   49088 command_runner.go:130] >     "restrictOOMScoreAdj": false,
	I1202 19:07:23.153544   49088 command_runner.go:130] >     "rootDir": "/var/lib/containerd/io.containerd.grpc.v1.cri",
	I1202 19:07:23.153549   49088 command_runner.go:130] >     "selinuxCategoryRange": 1024,
	I1202 19:07:23.153562   49088 command_runner.go:130] >     "stateDir": "/run/containerd/io.containerd.grpc.v1.cri",
	I1202 19:07:23.153570   49088 command_runner.go:130] >     "tolerateMissingHugetlbController": true,
	I1202 19:07:23.153575   49088 command_runner.go:130] >     "unsetSeccompProfile": ""
	I1202 19:07:23.153578   49088 command_runner.go:130] >   },
	I1202 19:07:23.153582   49088 command_runner.go:130] >   "features": {
	I1202 19:07:23.153588   49088 command_runner.go:130] >     "supplemental_groups_policy": true
	I1202 19:07:23.153597   49088 command_runner.go:130] >   },
	I1202 19:07:23.153605   49088 command_runner.go:130] >   "golang": "go1.24.9",
	I1202 19:07:23.153615   49088 command_runner.go:130] >   "lastCNILoadStatus": "cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config",
	I1202 19:07:23.153633   49088 command_runner.go:130] >   "lastCNILoadStatus.default": "cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config",
	I1202 19:07:23.153644   49088 command_runner.go:130] >   "runtimeHandlers": [
	I1202 19:07:23.153649   49088 command_runner.go:130] >     {
	I1202 19:07:23.153658   49088 command_runner.go:130] >       "features": {
	I1202 19:07:23.153664   49088 command_runner.go:130] >         "recursive_read_only_mounts": true,
	I1202 19:07:23.153669   49088 command_runner.go:130] >         "user_namespaces": true
	I1202 19:07:23.153675   49088 command_runner.go:130] >       }
	I1202 19:07:23.153679   49088 command_runner.go:130] >     },
	I1202 19:07:23.153686   49088 command_runner.go:130] >     {
	I1202 19:07:23.153691   49088 command_runner.go:130] >       "features": {
	I1202 19:07:23.153703   49088 command_runner.go:130] >         "recursive_read_only_mounts": true,
	I1202 19:07:23.153708   49088 command_runner.go:130] >         "user_namespaces": true
	I1202 19:07:23.153715   49088 command_runner.go:130] >       },
	I1202 19:07:23.153720   49088 command_runner.go:130] >       "name": "runc"
	I1202 19:07:23.153727   49088 command_runner.go:130] >     }
	I1202 19:07:23.153731   49088 command_runner.go:130] >   ],
	I1202 19:07:23.153738   49088 command_runner.go:130] >   "status": {
	I1202 19:07:23.153742   49088 command_runner.go:130] >     "conditions": [
	I1202 19:07:23.153746   49088 command_runner.go:130] >       {
	I1202 19:07:23.153751   49088 command_runner.go:130] >         "message": "",
	I1202 19:07:23.153757   49088 command_runner.go:130] >         "reason": "",
	I1202 19:07:23.153766   49088 command_runner.go:130] >         "status": true,
	I1202 19:07:23.153774   49088 command_runner.go:130] >         "type": "RuntimeReady"
	I1202 19:07:23.153781   49088 command_runner.go:130] >       },
	I1202 19:07:23.153785   49088 command_runner.go:130] >       {
	I1202 19:07:23.153792   49088 command_runner.go:130] >         "message": "Network plugin returns error: cni plugin not initialized",
	I1202 19:07:23.153797   49088 command_runner.go:130] >         "reason": "NetworkPluginNotReady",
	I1202 19:07:23.153805   49088 command_runner.go:130] >         "status": false,
	I1202 19:07:23.153810   49088 command_runner.go:130] >         "type": "NetworkReady"
	I1202 19:07:23.153814   49088 command_runner.go:130] >       },
	I1202 19:07:23.153820   49088 command_runner.go:130] >       {
	I1202 19:07:23.153824   49088 command_runner.go:130] >         "message": "",
	I1202 19:07:23.153828   49088 command_runner.go:130] >         "reason": "",
	I1202 19:07:23.153836   49088 command_runner.go:130] >         "status": true,
	I1202 19:07:23.153850   49088 command_runner.go:130] >         "type": "ContainerdHasNoDeprecationWarnings"
	I1202 19:07:23.153857   49088 command_runner.go:130] >       }
	I1202 19:07:23.153861   49088 command_runner.go:130] >     ]
	I1202 19:07:23.153868   49088 command_runner.go:130] >   }
	I1202 19:07:23.153871   49088 command_runner.go:130] > }
	I1202 19:07:23.157283   49088 cni.go:84] Creating CNI manager for ""
	I1202 19:07:23.157307   49088 cni.go:143] "docker" driver + "containerd" runtime found, recommending kindnet
	I1202 19:07:23.157324   49088 kubeadm.go:85] Using pod CIDR: 10.244.0.0/16
	I1202 19:07:23.157352   49088 kubeadm.go:190] kubeadm options: {CertDir:/var/lib/minikube/certs ServiceCIDR:10.96.0.0/12 PodSubnet:10.244.0.0/16 AdvertiseAddress:192.168.49.2 APIServerPort:8441 KubernetesVersion:v1.35.0-beta.0 EtcdDataDir:/var/lib/minikube/etcd EtcdExtraArgs:map[] ClusterName:functional-449836 NodeName:functional-449836 DNSDomain:cluster.local CRISocket:/run/containerd/containerd.sock ImageRepository: ComponentOptions:[{Component:apiServer ExtraArgs:map[enable-admission-plugins:NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota] Pairs:map[certSANs:["127.0.0.1", "localhost", "192.168.49.2"]]} {Component:controllerManager ExtraArgs:map[allocate-node-cidrs:true leader-elect:false] Pairs:map[]} {Component:scheduler ExtraArgs:map[leader-elect:false] Pairs:map[]}] FeatureArgs:map[] NodeIP:192.168.49.2 CgroupDriver:cgroupfs ClientCAFile:/var/lib/minikube/certs/ca.crt StaticPodPath:/etc/kubernetes/manifests ControlPlaneAddress:control-plane.minikube.internal KubeProxyOptions:map[] ResolvConfSearchRegression:false KubeletConfigOpts:map[containerRuntimeEndpoint:unix:///run/containerd/containerd.sock hairpinMode:hairpin-veth runtimeRequestTimeout:15m] PrependCriSocketUnix:true}
	I1202 19:07:23.157503   49088 kubeadm.go:196] kubeadm config:
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: InitConfiguration
	localAPIEndpoint:
	  advertiseAddress: 192.168.49.2
	  bindPort: 8441
	bootstrapTokens:
	  - groups:
	      - system:bootstrappers:kubeadm:default-node-token
	    ttl: 24h0m0s
	    usages:
	      - signing
	      - authentication
	nodeRegistration:
	  criSocket: unix:///run/containerd/containerd.sock
	  name: "functional-449836"
	  kubeletExtraArgs:
	    - name: "node-ip"
	      value: "192.168.49.2"
	  taints: []
	---
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: ClusterConfiguration
	apiServer:
	  certSANs: ["127.0.0.1", "localhost", "192.168.49.2"]
	  extraArgs:
	    - name: "enable-admission-plugins"
	      value: "NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota"
	controllerManager:
	  extraArgs:
	    - name: "allocate-node-cidrs"
	      value: "true"
	    - name: "leader-elect"
	      value: "false"
	scheduler:
	  extraArgs:
	    - name: "leader-elect"
	      value: "false"
	certificatesDir: /var/lib/minikube/certs
	clusterName: mk
	controlPlaneEndpoint: control-plane.minikube.internal:8441
	etcd:
	  local:
	    dataDir: /var/lib/minikube/etcd
	kubernetesVersion: v1.35.0-beta.0
	networking:
	  dnsDomain: cluster.local
	  podSubnet: "10.244.0.0/16"
	  serviceSubnet: 10.96.0.0/12
	---
	apiVersion: kubelet.config.k8s.io/v1beta1
	kind: KubeletConfiguration
	authentication:
	  x509:
	    clientCAFile: /var/lib/minikube/certs/ca.crt
	cgroupDriver: cgroupfs
	containerRuntimeEndpoint: unix:///run/containerd/containerd.sock
	hairpinMode: hairpin-veth
	runtimeRequestTimeout: 15m
	clusterDomain: "cluster.local"
	# disable disk resource management by default
	imageGCHighThresholdPercent: 100
	evictionHard:
	  nodefs.available: "0%"
	  nodefs.inodesFree: "0%"
	  imagefs.available: "0%"
	failSwapOn: false
	staticPodPath: /etc/kubernetes/manifests
	---
	apiVersion: kubeproxy.config.k8s.io/v1alpha1
	kind: KubeProxyConfiguration
	clusterCIDR: "10.244.0.0/16"
	metricsBindAddress: 0.0.0.0:10249
	conntrack:
	  maxPerCore: 0
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_established"
	  tcpEstablishedTimeout: 0s
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_close"
	  tcpCloseWaitTimeout: 0s
	
	I1202 19:07:23.157589   49088 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.35.0-beta.0
	I1202 19:07:23.165274   49088 command_runner.go:130] > kubeadm
	I1202 19:07:23.165296   49088 command_runner.go:130] > kubectl
	I1202 19:07:23.165301   49088 command_runner.go:130] > kubelet
	I1202 19:07:23.166244   49088 binaries.go:51] Found k8s binaries, skipping transfer
	I1202 19:07:23.166309   49088 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /var/tmp/minikube
	I1202 19:07:23.176520   49088 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (328 bytes)
	I1202 19:07:23.191534   49088 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (359 bytes)
	I1202 19:07:23.207596   49088 ssh_runner.go:362] scp memory --> /var/tmp/minikube/kubeadm.yaml.new (2237 bytes)
	I1202 19:07:23.221899   49088 ssh_runner.go:195] Run: grep 192.168.49.2	control-plane.minikube.internal$ /etc/hosts
	I1202 19:07:23.225538   49088 command_runner.go:130] > 192.168.49.2	control-plane.minikube.internal
	I1202 19:07:23.225972   49088 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1202 19:07:23.344071   49088 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I1202 19:07:24.171449   49088 certs.go:69] Setting up /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836 for IP: 192.168.49.2
	I1202 19:07:24.171473   49088 certs.go:195] generating shared ca certs ...
	I1202 19:07:24.171491   49088 certs.go:227] acquiring lock for ca certs: {Name:mk2ce7651a779b9fbf8eac798f9ac184328de0c6 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1202 19:07:24.171633   49088 certs.go:236] skipping valid "minikubeCA" ca cert: /home/jenkins/minikube-integration/22021-2487/.minikube/ca.key
	I1202 19:07:24.171683   49088 certs.go:236] skipping valid "proxyClientCA" ca cert: /home/jenkins/minikube-integration/22021-2487/.minikube/proxy-client-ca.key
	I1202 19:07:24.171697   49088 certs.go:257] generating profile certs ...
	I1202 19:07:24.171794   49088 certs.go:360] skipping valid signed profile cert regeneration for "minikube-user": /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/client.key
	I1202 19:07:24.171860   49088 certs.go:360] skipping valid signed profile cert regeneration for "minikube": /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/apiserver.key.a65b71da
	I1202 19:07:24.171905   49088 certs.go:360] skipping valid signed profile cert regeneration for "aggregator": /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/proxy-client.key
	I1202 19:07:24.171916   49088 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22021-2487/.minikube/ca.crt -> /var/lib/minikube/certs/ca.crt
	I1202 19:07:24.171929   49088 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22021-2487/.minikube/ca.key -> /var/lib/minikube/certs/ca.key
	I1202 19:07:24.171946   49088 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22021-2487/.minikube/proxy-client-ca.crt -> /var/lib/minikube/certs/proxy-client-ca.crt
	I1202 19:07:24.171957   49088 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22021-2487/.minikube/proxy-client-ca.key -> /var/lib/minikube/certs/proxy-client-ca.key
	I1202 19:07:24.171972   49088 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/apiserver.crt -> /var/lib/minikube/certs/apiserver.crt
	I1202 19:07:24.171985   49088 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/apiserver.key -> /var/lib/minikube/certs/apiserver.key
	I1202 19:07:24.172001   49088 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/proxy-client.crt -> /var/lib/minikube/certs/proxy-client.crt
	I1202 19:07:24.172012   49088 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/proxy-client.key -> /var/lib/minikube/certs/proxy-client.key
	I1202 19:07:24.172062   49088 certs.go:484] found cert: /home/jenkins/minikube-integration/22021-2487/.minikube/certs/4435.pem (1338 bytes)
	W1202 19:07:24.172113   49088 certs.go:480] ignoring /home/jenkins/minikube-integration/22021-2487/.minikube/certs/4435_empty.pem, impossibly tiny 0 bytes
	I1202 19:07:24.172126   49088 certs.go:484] found cert: /home/jenkins/minikube-integration/22021-2487/.minikube/certs/ca-key.pem (1679 bytes)
	I1202 19:07:24.172154   49088 certs.go:484] found cert: /home/jenkins/minikube-integration/22021-2487/.minikube/certs/ca.pem (1082 bytes)
	I1202 19:07:24.172189   49088 certs.go:484] found cert: /home/jenkins/minikube-integration/22021-2487/.minikube/certs/cert.pem (1123 bytes)
	I1202 19:07:24.172215   49088 certs.go:484] found cert: /home/jenkins/minikube-integration/22021-2487/.minikube/certs/key.pem (1675 bytes)
	I1202 19:07:24.172266   49088 certs.go:484] found cert: /home/jenkins/minikube-integration/22021-2487/.minikube/files/etc/ssl/certs/44352.pem (1708 bytes)
	I1202 19:07:24.172298   49088 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22021-2487/.minikube/files/etc/ssl/certs/44352.pem -> /usr/share/ca-certificates/44352.pem
	I1202 19:07:24.172314   49088 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22021-2487/.minikube/ca.crt -> /usr/share/ca-certificates/minikubeCA.pem
	I1202 19:07:24.172347   49088 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/22021-2487/.minikube/certs/4435.pem -> /usr/share/ca-certificates/4435.pem
	I1202 19:07:24.172878   49088 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I1202 19:07:24.192840   49088 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1675 bytes)
	I1202 19:07:24.210709   49088 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I1202 19:07:24.228270   49088 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1675 bytes)
	I1202 19:07:24.246519   49088 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1424 bytes)
	I1202 19:07:24.264649   49088 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1675 bytes)
	I1202 19:07:24.283289   49088 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I1202 19:07:24.302316   49088 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1675 bytes)
	I1202 19:07:24.320907   49088 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/files/etc/ssl/certs/44352.pem --> /usr/share/ca-certificates/44352.pem (1708 bytes)
	I1202 19:07:24.338895   49088 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I1202 19:07:24.356995   49088 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/certs/4435.pem --> /usr/share/ca-certificates/4435.pem (1338 bytes)
	I1202 19:07:24.374784   49088 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (738 bytes)
	I1202 19:07:24.388173   49088 ssh_runner.go:195] Run: openssl version
	I1202 19:07:24.394457   49088 command_runner.go:130] > OpenSSL 3.0.17 1 Jul 2025 (Library: OpenSSL 3.0.17 1 Jul 2025)
	I1202 19:07:24.394840   49088 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/44352.pem && ln -fs /usr/share/ca-certificates/44352.pem /etc/ssl/certs/44352.pem"
	I1202 19:07:24.403512   49088 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/44352.pem
	I1202 19:07:24.407229   49088 command_runner.go:130] > -rw-r--r-- 1 root root 1708 Dec  2 18:58 /usr/share/ca-certificates/44352.pem
	I1202 19:07:24.407385   49088 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Dec  2 18:58 /usr/share/ca-certificates/44352.pem
	I1202 19:07:24.407455   49088 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/44352.pem
	I1202 19:07:24.448501   49088 command_runner.go:130] > 3ec20f2e
	I1202 19:07:24.448942   49088 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/3ec20f2e.0 || ln -fs /etc/ssl/certs/44352.pem /etc/ssl/certs/3ec20f2e.0"
	I1202 19:07:24.456981   49088 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I1202 19:07:24.465478   49088 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I1202 19:07:24.469306   49088 command_runner.go:130] > -rw-r--r-- 1 root root 1111 Dec  2 18:48 /usr/share/ca-certificates/minikubeCA.pem
	I1202 19:07:24.469374   49088 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Dec  2 18:48 /usr/share/ca-certificates/minikubeCA.pem
	I1202 19:07:24.469438   49088 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I1202 19:07:24.510270   49088 command_runner.go:130] > b5213941
	I1202 19:07:24.510784   49088 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
	I1202 19:07:24.518790   49088 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/4435.pem && ln -fs /usr/share/ca-certificates/4435.pem /etc/ssl/certs/4435.pem"
	I1202 19:07:24.527001   49088 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/4435.pem
	I1202 19:07:24.530919   49088 command_runner.go:130] > -rw-r--r-- 1 root root 1338 Dec  2 18:58 /usr/share/ca-certificates/4435.pem
	I1202 19:07:24.530959   49088 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Dec  2 18:58 /usr/share/ca-certificates/4435.pem
	I1202 19:07:24.531008   49088 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/4435.pem
	I1202 19:07:24.571727   49088 command_runner.go:130] > 51391683
	I1202 19:07:24.572161   49088 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/51391683.0 || ln -fs /etc/ssl/certs/4435.pem /etc/ssl/certs/51391683.0"
	I1202 19:07:24.580157   49088 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I1202 19:07:24.584062   49088 command_runner.go:130] >   File: /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I1202 19:07:24.584087   49088 command_runner.go:130] >   Size: 1176      	Blocks: 8          IO Block: 4096   regular file
	I1202 19:07:24.584094   49088 command_runner.go:130] > Device: 259,1	Inode: 848916      Links: 1
	I1202 19:07:24.584101   49088 command_runner.go:130] > Access: (0644/-rw-r--r--)  Uid: (    0/    root)   Gid: (    0/    root)
	I1202 19:07:24.584108   49088 command_runner.go:130] > Access: 2025-12-02 19:03:16.577964732 +0000
	I1202 19:07:24.584114   49088 command_runner.go:130] > Modify: 2025-12-02 18:59:11.965818380 +0000
	I1202 19:07:24.584119   49088 command_runner.go:130] > Change: 2025-12-02 18:59:11.965818380 +0000
	I1202 19:07:24.584125   49088 command_runner.go:130] >  Birth: 2025-12-02 18:59:11.965818380 +0000
	I1202 19:07:24.584207   49088 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-etcd-client.crt -checkend 86400
	I1202 19:07:24.630311   49088 command_runner.go:130] > Certificate will not expire
	I1202 19:07:24.630810   49088 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-kubelet-client.crt -checkend 86400
	I1202 19:07:24.671995   49088 command_runner.go:130] > Certificate will not expire
	I1202 19:07:24.672412   49088 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/server.crt -checkend 86400
	I1202 19:07:24.713648   49088 command_runner.go:130] > Certificate will not expire
	I1202 19:07:24.713758   49088 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/healthcheck-client.crt -checkend 86400
	I1202 19:07:24.754977   49088 command_runner.go:130] > Certificate will not expire
	I1202 19:07:24.755077   49088 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/peer.crt -checkend 86400
	I1202 19:07:24.800995   49088 command_runner.go:130] > Certificate will not expire
	I1202 19:07:24.801486   49088 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/front-proxy-client.crt -checkend 86400
	I1202 19:07:24.844718   49088 command_runner.go:130] > Certificate will not expire
	I1202 19:07:24.845325   49088 kubeadm.go:401] StartCluster: {Name:functional-449836 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-449836 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1202 19:07:24.845410   49088 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:paused Name: Namespaces:[kube-system]}
	I1202 19:07:24.845499   49088 ssh_runner.go:195] Run: sudo -s eval "crictl ps -a --quiet --label io.kubernetes.pod.namespace=kube-system"
	I1202 19:07:24.875465   49088 cri.go:89] found id: ""
	I1202 19:07:24.875565   49088 ssh_runner.go:195] Run: sudo ls /var/lib/kubelet/kubeadm-flags.env /var/lib/kubelet/config.yaml /var/lib/minikube/etcd
	I1202 19:07:24.882887   49088 command_runner.go:130] > /var/lib/kubelet/config.yaml
	I1202 19:07:24.882908   49088 command_runner.go:130] > /var/lib/kubelet/kubeadm-flags.env
	I1202 19:07:24.882928   49088 command_runner.go:130] > /var/lib/minikube/etcd:
	I1202 19:07:24.883961   49088 kubeadm.go:417] found existing configuration files, will attempt cluster restart
	I1202 19:07:24.884012   49088 kubeadm.go:598] restartPrimaryControlPlane start ...
	I1202 19:07:24.884084   49088 ssh_runner.go:195] Run: sudo test -d /data/minikube
	I1202 19:07:24.891632   49088 kubeadm.go:131] /data/minikube skipping compat symlinks: sudo test -d /data/minikube: Process exited with status 1
	stdout:
	
	stderr:
	I1202 19:07:24.892026   49088 kubeconfig.go:47] verify endpoint returned: get endpoint: "functional-449836" does not appear in /home/jenkins/minikube-integration/22021-2487/kubeconfig
	I1202 19:07:24.892129   49088 kubeconfig.go:62] /home/jenkins/minikube-integration/22021-2487/kubeconfig needs updating (will repair): [kubeconfig missing "functional-449836" cluster setting kubeconfig missing "functional-449836" context setting]
	I1202 19:07:24.892546   49088 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/22021-2487/kubeconfig: {Name:mka13b3f16f6b2d645ade32cb83bfcf203300413 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1202 19:07:24.892988   49088 loader.go:402] Config loaded from file:  /home/jenkins/minikube-integration/22021-2487/kubeconfig
	I1202 19:07:24.893140   49088 kapi.go:59] client config for functional-449836: &rest.Config{Host:"https://192.168.49.2:8441", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/client.crt", KeyFile:"/home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/client.key", CAFile:"/home/jenkins/minikube-integration/22021-2487/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x1fb33c0), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), WarningHandlerWithContext:rest.WarningHandlerWithContext(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	I1202 19:07:24.893652   49088 envvar.go:172] "Feature gate default state" feature="WatchListClient" enabled=false
	I1202 19:07:24.893721   49088 cert_rotation.go:141] "Starting client certificate rotation controller" logger="tls-transport-cache"
	I1202 19:07:24.893742   49088 envvar.go:172] "Feature gate default state" feature="ClientsAllowCBOR" enabled=false
	I1202 19:07:24.893817   49088 envvar.go:172] "Feature gate default state" feature="ClientsPreferCBOR" enabled=false
	I1202 19:07:24.893840   49088 envvar.go:172] "Feature gate default state" feature="InformerResourceVersion" enabled=false
	I1202 19:07:24.893879   49088 envvar.go:172] "Feature gate default state" feature="InOrderInformers" enabled=true
	I1202 19:07:24.894204   49088 ssh_runner.go:195] Run: sudo diff -u /var/tmp/minikube/kubeadm.yaml /var/tmp/minikube/kubeadm.yaml.new
	I1202 19:07:24.902267   49088 kubeadm.go:635] The running cluster does not require reconfiguration: 192.168.49.2
	I1202 19:07:24.902298   49088 kubeadm.go:602] duration metric: took 18.265587ms to restartPrimaryControlPlane
	I1202 19:07:24.902309   49088 kubeadm.go:403] duration metric: took 56.993765ms to StartCluster
	I1202 19:07:24.902355   49088 settings.go:142] acquiring lock: {Name:mka76ea0dcf16fdbb68808885f8360c0083029b4 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1202 19:07:24.902437   49088 settings.go:150] Updating kubeconfig:  /home/jenkins/minikube-integration/22021-2487/kubeconfig
	I1202 19:07:24.903036   49088 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/22021-2487/kubeconfig: {Name:mka13b3f16f6b2d645ade32cb83bfcf203300413 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1202 19:07:24.903251   49088 start.go:236] Will wait 6m0s for node &{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}
	I1202 19:07:24.903573   49088 config.go:182] Loaded profile config "functional-449836": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-beta.0
	I1202 19:07:24.903617   49088 addons.go:527] enable addons start: toEnable=map[ambassador:false amd-gpu-device-plugin:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:true efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubetail:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:true storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false]
	I1202 19:07:24.903676   49088 addons.go:70] Setting storage-provisioner=true in profile "functional-449836"
	I1202 19:07:24.903691   49088 addons.go:239] Setting addon storage-provisioner=true in "functional-449836"
	I1202 19:07:24.903717   49088 host.go:66] Checking if "functional-449836" exists ...
	I1202 19:07:24.903830   49088 addons.go:70] Setting default-storageclass=true in profile "functional-449836"
	I1202 19:07:24.903877   49088 addons_storage_classes.go:34] enableOrDisableStorageClasses default-storageclass=true on "functional-449836"
	I1202 19:07:24.904207   49088 cli_runner.go:164] Run: docker container inspect functional-449836 --format={{.State.Status}}
	I1202 19:07:24.904250   49088 cli_runner.go:164] Run: docker container inspect functional-449836 --format={{.State.Status}}
	I1202 19:07:24.909664   49088 out.go:179] * Verifying Kubernetes components...
	I1202 19:07:24.912752   49088 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1202 19:07:24.942660   49088 out.go:179]   - Using image gcr.io/k8s-minikube/storage-provisioner:v5
	I1202 19:07:24.943205   49088 loader.go:402] Config loaded from file:  /home/jenkins/minikube-integration/22021-2487/kubeconfig
	I1202 19:07:24.943381   49088 kapi.go:59] client config for functional-449836: &rest.Config{Host:"https://192.168.49.2:8441", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/client.crt", KeyFile:"/home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/client.key", CAFile:"/home/jenkins/minikube-integration/22021-2487/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x1fb33c0), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), WarningHandlerWithContext:rest.WarningHandlerWithContext(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	I1202 19:07:24.943666   49088 addons.go:239] Setting addon default-storageclass=true in "functional-449836"
	I1202 19:07:24.943695   49088 host.go:66] Checking if "functional-449836" exists ...
	I1202 19:07:24.944105   49088 cli_runner.go:164] Run: docker container inspect functional-449836 --format={{.State.Status}}
	I1202 19:07:24.945588   49088 addons.go:436] installing /etc/kubernetes/addons/storage-provisioner.yaml
	I1202 19:07:24.945617   49088 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/storage-provisioner.yaml (2676 bytes)
	I1202 19:07:24.945676   49088 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-449836
	I1202 19:07:24.976744   49088 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:32788 SSHKeyPath:/home/jenkins/minikube-integration/22021-2487/.minikube/machines/functional-449836/id_rsa Username:docker}
	I1202 19:07:24.983018   49088 addons.go:436] installing /etc/kubernetes/addons/storageclass.yaml
	I1202 19:07:24.983040   49088 ssh_runner.go:362] scp storageclass/storageclass.yaml --> /etc/kubernetes/addons/storageclass.yaml (271 bytes)
	I1202 19:07:24.983109   49088 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-449836
	I1202 19:07:25.013238   49088 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:32788 SSHKeyPath:/home/jenkins/minikube-integration/22021-2487/.minikube/machines/functional-449836/id_rsa Username:docker}
	I1202 19:07:25.139303   49088 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I1202 19:07:25.147308   49088 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1202 19:07:25.166870   49088 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply -f /etc/kubernetes/addons/storageclass.yaml
	I1202 19:07:25.922715   49088 node_ready.go:35] waiting up to 6m0s for node "functional-449836" to be "Ready" ...
	I1202 19:07:25.922842   49088 type.go:168] "Request Body" body=""
	I1202 19:07:25.922904   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:25.923137   49088 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1202 19:07:25.923161   49088 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:25.923181   49088 retry.go:31] will retry after 314.802872ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:25.923212   49088 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1202 19:07:25.923227   49088 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:25.923235   49088 retry.go:31] will retry after 316.161686ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:25.923297   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:26.238968   49088 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1202 19:07:26.239458   49088 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1202 19:07:26.312262   49088 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1202 19:07:26.312301   49088 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:26.312346   49088 retry.go:31] will retry after 358.686092ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:26.320393   49088 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1202 19:07:26.320484   49088 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:26.320525   49088 retry.go:31] will retry after 528.121505ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:26.423804   49088 type.go:168] "Request Body" body=""
	I1202 19:07:26.423895   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:26.424214   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:26.671815   49088 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1202 19:07:26.745439   49088 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1202 19:07:26.745497   49088 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:26.745515   49088 retry.go:31] will retry after 446.477413ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:26.849789   49088 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1202 19:07:26.909069   49088 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1202 19:07:26.909108   49088 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:26.909134   49088 retry.go:31] will retry after 684.877567ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:26.923341   49088 type.go:168] "Request Body" body=""
	I1202 19:07:26.923433   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:26.923791   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:27.192236   49088 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1202 19:07:27.247207   49088 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1202 19:07:27.250502   49088 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:27.250546   49088 retry.go:31] will retry after 797.707708ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:27.423790   49088 type.go:168] "Request Body" body=""
	I1202 19:07:27.423889   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:27.424196   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:27.594774   49088 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1202 19:07:27.660877   49088 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1202 19:07:27.660957   49088 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:27.660987   49088 retry.go:31] will retry after 601.48037ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:27.923401   49088 type.go:168] "Request Body" body=""
	I1202 19:07:27.923475   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:27.923784   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:07:27.923848   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:07:28.049160   49088 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1202 19:07:28.112455   49088 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1202 19:07:28.112493   49088 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:28.112512   49088 retry.go:31] will retry after 941.564206ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:28.262919   49088 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1202 19:07:28.323250   49088 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1202 19:07:28.323307   49088 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:28.323325   49088 retry.go:31] will retry after 741.834409ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:28.423555   49088 type.go:168] "Request Body" body=""
	I1202 19:07:28.423652   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:28.423971   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:28.923658   49088 type.go:168] "Request Body" body=""
	I1202 19:07:28.923731   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:28.923989   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:29.054311   49088 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1202 19:07:29.065740   49088 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1202 19:07:29.126744   49088 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1202 19:07:29.126791   49088 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:29.126812   49088 retry.go:31] will retry after 2.378740888s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:29.143543   49088 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1202 19:07:29.143609   49088 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:29.143631   49088 retry.go:31] will retry after 2.739062704s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:29.422943   49088 type.go:168] "Request Body" body=""
	I1202 19:07:29.423043   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:29.423398   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:29.923116   49088 type.go:168] "Request Body" body=""
	I1202 19:07:29.923203   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:29.923554   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:30.422925   49088 type.go:168] "Request Body" body=""
	I1202 19:07:30.423004   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:30.423294   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:07:30.423351   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:07:30.922969   49088 type.go:168] "Request Body" body=""
	I1202 19:07:30.923047   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:30.923375   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:31.422975   49088 type.go:168] "Request Body" body=""
	I1202 19:07:31.423051   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:31.423376   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:31.506668   49088 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1202 19:07:31.565098   49088 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1202 19:07:31.565149   49088 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:31.565168   49088 retry.go:31] will retry after 3.30231188s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:31.883619   49088 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1202 19:07:31.923118   49088 type.go:168] "Request Body" body=""
	I1202 19:07:31.923190   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:31.923483   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:31.949881   49088 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1202 19:07:31.953682   49088 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:31.953716   49088 retry.go:31] will retry after 2.323480137s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:32.422997   49088 type.go:168] "Request Body" body=""
	I1202 19:07:32.423069   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:32.423382   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:07:32.423441   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:07:32.923115   49088 type.go:168] "Request Body" body=""
	I1202 19:07:32.923193   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:32.923525   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:33.422891   49088 type.go:168] "Request Body" body=""
	I1202 19:07:33.422956   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:33.423209   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:33.922911   49088 type.go:168] "Request Body" body=""
	I1202 19:07:33.922985   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:33.923302   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:34.277557   49088 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1202 19:07:34.337253   49088 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1202 19:07:34.337306   49088 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:34.337326   49088 retry.go:31] will retry after 5.941517157s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:34.423656   49088 type.go:168] "Request Body" body=""
	I1202 19:07:34.423738   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:34.424084   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:07:34.424136   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:07:34.867735   49088 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1202 19:07:34.923406   49088 type.go:168] "Request Body" body=""
	I1202 19:07:34.923506   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:34.923762   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:34.931582   49088 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1202 19:07:34.931622   49088 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:34.931641   49088 retry.go:31] will retry after 5.732328972s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:35.423000   49088 type.go:168] "Request Body" body=""
	I1202 19:07:35.423076   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:35.423385   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:35.922985   49088 type.go:168] "Request Body" body=""
	I1202 19:07:35.923063   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:35.923444   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:36.422994   49088 type.go:168] "Request Body" body=""
	I1202 19:07:36.423069   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:36.423390   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:36.922999   49088 type.go:168] "Request Body" body=""
	I1202 19:07:36.923077   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:36.923397   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:07:36.923453   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:07:37.423120   49088 type.go:168] "Request Body" body=""
	I1202 19:07:37.423196   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:37.423525   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:37.923076   49088 type.go:168] "Request Body" body=""
	I1202 19:07:37.923148   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:37.923450   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:38.423008   49088 type.go:168] "Request Body" body=""
	I1202 19:07:38.423082   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:38.423432   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:38.923000   49088 type.go:168] "Request Body" body=""
	I1202 19:07:38.923074   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:38.923410   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:39.423757   49088 type.go:168] "Request Body" body=""
	I1202 19:07:39.423827   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:39.424076   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:07:39.424115   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:07:39.923939   49088 type.go:168] "Request Body" body=""
	I1202 19:07:39.924013   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:39.924357   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:40.279081   49088 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1202 19:07:40.340610   49088 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1202 19:07:40.340655   49088 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:40.340674   49088 retry.go:31] will retry after 7.832295728s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:40.423881   49088 type.go:168] "Request Body" body=""
	I1202 19:07:40.423959   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:40.424241   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:40.664676   49088 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1202 19:07:40.720825   49088 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1202 19:07:40.724043   49088 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:40.724077   49088 retry.go:31] will retry after 3.410570548s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:40.923400   49088 type.go:168] "Request Body" body=""
	I1202 19:07:40.923497   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:40.923882   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:41.423714   49088 type.go:168] "Request Body" body=""
	I1202 19:07:41.423784   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:41.424115   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:07:41.424172   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:07:41.922915   49088 type.go:168] "Request Body" body=""
	I1202 19:07:41.922990   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:41.923344   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:42.422900   49088 type.go:168] "Request Body" body=""
	I1202 19:07:42.422980   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:42.423254   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:42.922960   49088 type.go:168] "Request Body" body=""
	I1202 19:07:42.923033   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:42.923341   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:43.422982   49088 type.go:168] "Request Body" body=""
	I1202 19:07:43.423067   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:43.423429   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:43.923715   49088 type.go:168] "Request Body" body=""
	I1202 19:07:43.923780   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:43.924087   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:07:43.924145   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:07:44.135480   49088 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1202 19:07:44.194407   49088 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1202 19:07:44.194462   49088 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:44.194482   49088 retry.go:31] will retry after 9.43511002s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:44.423808   49088 type.go:168] "Request Body" body=""
	I1202 19:07:44.423884   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:44.424207   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:44.923173   49088 type.go:168] "Request Body" body=""
	I1202 19:07:44.923287   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:44.923608   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:45.423511   49088 type.go:168] "Request Body" body=""
	I1202 19:07:45.423594   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:45.423852   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:45.923682   49088 type.go:168] "Request Body" body=""
	I1202 19:07:45.923754   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:45.924062   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:46.423867   49088 type.go:168] "Request Body" body=""
	I1202 19:07:46.423945   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:46.424267   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:07:46.424344   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:07:46.923022   49088 type.go:168] "Request Body" body=""
	I1202 19:07:46.923087   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:46.923335   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:47.422976   49088 type.go:168] "Request Body" body=""
	I1202 19:07:47.423049   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:47.423383   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:47.922938   49088 type.go:168] "Request Body" body=""
	I1202 19:07:47.923018   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:47.923343   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:48.173817   49088 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1202 19:07:48.233696   49088 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1202 19:07:48.233741   49088 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:48.233760   49088 retry.go:31] will retry after 11.915058211s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:48.423860   49088 type.go:168] "Request Body" body=""
	I1202 19:07:48.423931   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:48.424192   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:48.922967   49088 type.go:168] "Request Body" body=""
	I1202 19:07:48.923043   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:48.923338   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:07:48.923389   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:07:49.423071   49088 type.go:168] "Request Body" body=""
	I1202 19:07:49.423160   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:49.423457   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:49.923767   49088 type.go:168] "Request Body" body=""
	I1202 19:07:49.923839   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:49.924094   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:50.423628   49088 type.go:168] "Request Body" body=""
	I1202 19:07:50.423697   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:50.424008   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:50.923823   49088 type.go:168] "Request Body" body=""
	I1202 19:07:50.923896   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:50.924199   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:07:50.924253   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:07:51.423846   49088 type.go:168] "Request Body" body=""
	I1202 19:07:51.423915   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:51.424234   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:51.922982   49088 type.go:168] "Request Body" body=""
	I1202 19:07:51.923118   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:51.923450   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:52.423137   49088 type.go:168] "Request Body" body=""
	I1202 19:07:52.423209   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:52.423568   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:52.923553   49088 type.go:168] "Request Body" body=""
	I1202 19:07:52.923629   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:52.923890   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:53.423671   49088 type.go:168] "Request Body" body=""
	I1202 19:07:53.423751   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:53.424089   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:07:53.424151   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:07:53.630602   49088 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1202 19:07:53.701195   49088 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1202 19:07:53.708777   49088 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:53.708825   49088 retry.go:31] will retry after 18.228322251s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:07:53.923261   49088 type.go:168] "Request Body" body=""
	I1202 19:07:53.923336   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:53.923674   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:54.422909   49088 type.go:168] "Request Body" body=""
	I1202 19:07:54.422976   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:54.423235   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:54.923162   49088 type.go:168] "Request Body" body=""
	I1202 19:07:54.923249   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:54.923575   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:55.422969   49088 type.go:168] "Request Body" body=""
	I1202 19:07:55.423050   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:55.423372   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:55.922912   49088 type.go:168] "Request Body" body=""
	I1202 19:07:55.922984   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:55.923296   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:07:55.923346   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:07:56.422984   49088 type.go:168] "Request Body" body=""
	I1202 19:07:56.423058   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:56.423408   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:56.922983   49088 type.go:168] "Request Body" body=""
	I1202 19:07:56.923062   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:56.923429   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:57.423124   49088 type.go:168] "Request Body" body=""
	I1202 19:07:57.423196   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:57.423456   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:57.922920   49088 type.go:168] "Request Body" body=""
	I1202 19:07:57.922991   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:57.923317   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:07:57.923373   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:07:58.422950   49088 type.go:168] "Request Body" body=""
	I1202 19:07:58.423033   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:58.423366   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:58.923630   49088 type.go:168] "Request Body" body=""
	I1202 19:07:58.923696   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:58.924020   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:59.423807   49088 type.go:168] "Request Body" body=""
	I1202 19:07:59.423887   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:59.424243   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:07:59.922942   49088 type.go:168] "Request Body" body=""
	I1202 19:07:59.923022   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:07:59.923353   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:07:59.923410   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:08:00.150075   49088 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1202 19:08:00.323059   49088 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1202 19:08:00.323111   49088 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:08:00.323132   49088 retry.go:31] will retry after 12.256345503s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:08:00.423512   49088 type.go:168] "Request Body" body=""
	I1202 19:08:00.423597   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:00.423977   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:00.923784   49088 type.go:168] "Request Body" body=""
	I1202 19:08:00.923865   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:00.924196   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:01.422909   49088 type.go:168] "Request Body" body=""
	I1202 19:08:01.422990   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:01.423304   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:01.922933   49088 type.go:168] "Request Body" body=""
	I1202 19:08:01.923032   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:01.923287   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:02.422978   49088 type.go:168] "Request Body" body=""
	I1202 19:08:02.423052   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:02.423379   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:08:02.423436   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:08:02.923122   49088 type.go:168] "Request Body" body=""
	I1202 19:08:02.923245   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:02.923555   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:03.423814   49088 type.go:168] "Request Body" body=""
	I1202 19:08:03.423889   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:03.424141   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:03.923895   49088 type.go:168] "Request Body" body=""
	I1202 19:08:03.923996   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:03.924288   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:04.422987   49088 type.go:168] "Request Body" body=""
	I1202 19:08:04.423083   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:04.423375   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:04.922988   49088 type.go:168] "Request Body" body=""
	I1202 19:08:04.923059   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:04.923336   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:08:04.923376   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:08:05.422976   49088 type.go:168] "Request Body" body=""
	I1202 19:08:05.423047   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:05.423391   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:05.922959   49088 type.go:168] "Request Body" body=""
	I1202 19:08:05.923031   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:05.923359   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:06.423783   49088 type.go:168] "Request Body" body=""
	I1202 19:08:06.423854   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:06.424112   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:06.923877   49088 type.go:168] "Request Body" body=""
	I1202 19:08:06.923974   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:06.924303   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:08:06.924381   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:08:07.423044   49088 type.go:168] "Request Body" body=""
	I1202 19:08:07.423125   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:07.423474   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:07.922856   49088 type.go:168] "Request Body" body=""
	I1202 19:08:07.922930   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:07.923205   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:08.422913   49088 type.go:168] "Request Body" body=""
	I1202 19:08:08.422992   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:08.423315   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:08.922886   49088 type.go:168] "Request Body" body=""
	I1202 19:08:08.922995   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:08.923313   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:09.422913   49088 type.go:168] "Request Body" body=""
	I1202 19:08:09.423006   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:09.423295   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:08:09.423343   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:08:09.922894   49088 type.go:168] "Request Body" body=""
	I1202 19:08:09.922995   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:09.923296   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:10.423073   49088 type.go:168] "Request Body" body=""
	I1202 19:08:10.423153   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:10.423491   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:10.923741   49088 type.go:168] "Request Body" body=""
	I1202 19:08:10.923814   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:10.924073   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:11.423834   49088 type.go:168] "Request Body" body=""
	I1202 19:08:11.423907   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:11.424248   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:08:11.424304   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:08:11.922960   49088 type.go:168] "Request Body" body=""
	I1202 19:08:11.923033   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:11.923342   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:11.937687   49088 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1202 19:08:11.996748   49088 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1202 19:08:11.999800   49088 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:08:11.999828   49088 retry.go:31] will retry after 12.016513449s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:08:12.423502   49088 type.go:168] "Request Body" body=""
	I1202 19:08:12.423582   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:12.423831   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:12.580354   49088 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1202 19:08:12.637408   49088 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1202 19:08:12.637456   49088 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:08:12.637477   49088 retry.go:31] will retry after 30.215930355s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:08:12.923948   49088 type.go:168] "Request Body" body=""
	I1202 19:08:12.924043   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:12.924384   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:13.422995   49088 type.go:168] "Request Body" body=""
	I1202 19:08:13.423082   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:13.423402   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:13.923854   49088 type.go:168] "Request Body" body=""
	I1202 19:08:13.923924   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:13.924172   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:08:13.924221   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:08:14.422931   49088 type.go:168] "Request Body" body=""
	I1202 19:08:14.423013   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:14.423360   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:14.923106   49088 type.go:168] "Request Body" body=""
	I1202 19:08:14.923201   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:14.923504   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:15.423455   49088 type.go:168] "Request Body" body=""
	I1202 19:08:15.423543   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:15.423801   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:15.923582   49088 type.go:168] "Request Body" body=""
	I1202 19:08:15.923658   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:15.923982   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:16.423696   49088 type.go:168] "Request Body" body=""
	I1202 19:08:16.423768   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:16.424069   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:08:16.424123   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:08:16.923441   49088 type.go:168] "Request Body" body=""
	I1202 19:08:16.923513   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:16.923823   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:17.423623   49088 type.go:168] "Request Body" body=""
	I1202 19:08:17.423715   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:17.424059   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:17.923916   49088 type.go:168] "Request Body" body=""
	I1202 19:08:17.923987   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:17.924303   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:18.422927   49088 type.go:168] "Request Body" body=""
	I1202 19:08:18.423013   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:18.423293   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:18.922978   49088 type.go:168] "Request Body" body=""
	I1202 19:08:18.923057   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:18.923439   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:08:18.923511   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:08:19.423204   49088 type.go:168] "Request Body" body=""
	I1202 19:08:19.423280   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:19.423633   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:19.923322   49088 type.go:168] "Request Body" body=""
	I1202 19:08:19.923392   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:19.923647   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:20.423776   49088 type.go:168] "Request Body" body=""
	I1202 19:08:20.423870   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:20.424201   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:20.922961   49088 type.go:168] "Request Body" body=""
	I1202 19:08:20.923040   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:20.923370   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:21.423693   49088 type.go:168] "Request Body" body=""
	I1202 19:08:21.423801   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:21.424068   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:08:21.424117   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:08:21.923863   49088 type.go:168] "Request Body" body=""
	I1202 19:08:21.923935   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:21.924262   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:22.422993   49088 type.go:168] "Request Body" body=""
	I1202 19:08:22.423066   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:22.423391   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:22.922927   49088 type.go:168] "Request Body" body=""
	I1202 19:08:22.922998   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:22.923323   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:23.422977   49088 type.go:168] "Request Body" body=""
	I1202 19:08:23.423052   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:23.423364   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:23.922971   49088 type.go:168] "Request Body" body=""
	I1202 19:08:23.923047   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:23.923343   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:08:23.923391   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:08:24.016567   49088 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1202 19:08:24.078750   49088 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1202 19:08:24.078790   49088 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:08:24.078809   49088 retry.go:31] will retry after 37.473532818s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:08:24.423149   49088 type.go:168] "Request Body" body=""
	I1202 19:08:24.423225   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:24.423606   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:24.923585   49088 type.go:168] "Request Body" body=""
	I1202 19:08:24.923686   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:24.924015   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:25.423855   49088 type.go:168] "Request Body" body=""
	I1202 19:08:25.423933   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:25.424248   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:25.923542   49088 type.go:168] "Request Body" body=""
	I1202 19:08:25.923615   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:25.923871   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:08:25.923923   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:08:26.423702   49088 type.go:168] "Request Body" body=""
	I1202 19:08:26.423799   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:26.424100   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:26.923908   49088 type.go:168] "Request Body" body=""
	I1202 19:08:26.923990   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:26.924357   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:27.422916   49088 type.go:168] "Request Body" body=""
	I1202 19:08:27.423000   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:27.423295   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:27.922995   49088 type.go:168] "Request Body" body=""
	I1202 19:08:27.923085   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:27.923386   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:28.423073   49088 type.go:168] "Request Body" body=""
	I1202 19:08:28.423198   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:28.423550   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:08:28.423605   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:08:28.923207   49088 type.go:168] "Request Body" body=""
	I1202 19:08:28.923280   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:28.923547   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:29.423229   49088 type.go:168] "Request Body" body=""
	I1202 19:08:29.423310   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:29.423621   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:29.922966   49088 type.go:168] "Request Body" body=""
	I1202 19:08:29.923038   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:29.923334   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:30.422914   49088 type.go:168] "Request Body" body=""
	I1202 19:08:30.422986   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:30.423290   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:30.922992   49088 type.go:168] "Request Body" body=""
	I1202 19:08:30.923070   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:30.923374   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:08:30.923423   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:08:31.422962   49088 type.go:168] "Request Body" body=""
	I1202 19:08:31.423044   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:31.423370   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:31.923658   49088 type.go:168] "Request Body" body=""
	I1202 19:08:31.923727   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:31.923984   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:32.423783   49088 type.go:168] "Request Body" body=""
	I1202 19:08:32.423853   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:32.424194   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:32.923864   49088 type.go:168] "Request Body" body=""
	I1202 19:08:32.923952   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:32.924274   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:08:32.924361   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:08:33.422870   49088 type.go:168] "Request Body" body=""
	I1202 19:08:33.422935   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:33.423233   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:33.922930   49088 type.go:168] "Request Body" body=""
	I1202 19:08:33.923021   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:33.923340   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:34.423053   49088 type.go:168] "Request Body" body=""
	I1202 19:08:34.423137   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:34.423440   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:34.923286   49088 type.go:168] "Request Body" body=""
	I1202 19:08:34.923353   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:34.923610   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:35.423738   49088 type.go:168] "Request Body" body=""
	I1202 19:08:35.423811   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:35.424122   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:08:35.424178   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:08:35.923982   49088 type.go:168] "Request Body" body=""
	I1202 19:08:35.924054   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:35.924397   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:36.423490   49088 type.go:168] "Request Body" body=""
	I1202 19:08:36.423577   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:36.423904   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:36.923609   49088 type.go:168] "Request Body" body=""
	I1202 19:08:36.923698   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:36.924003   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:37.423836   49088 type.go:168] "Request Body" body=""
	I1202 19:08:37.423908   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:37.424273   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:08:37.424347   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:08:37.923878   49088 type.go:168] "Request Body" body=""
	I1202 19:08:37.923949   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:37.924222   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:38.422922   49088 type.go:168] "Request Body" body=""
	I1202 19:08:38.422995   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:38.423329   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:38.923060   49088 type.go:168] "Request Body" body=""
	I1202 19:08:38.923133   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:38.923451   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:39.422914   49088 type.go:168] "Request Body" body=""
	I1202 19:08:39.422990   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:39.423240   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:39.922980   49088 type.go:168] "Request Body" body=""
	I1202 19:08:39.923057   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:39.923354   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:08:39.923403   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:08:40.423331   49088 type.go:168] "Request Body" body=""
	I1202 19:08:40.423423   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:40.423754   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:40.923541   49088 type.go:168] "Request Body" body=""
	I1202 19:08:40.923652   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:40.923952   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:41.423758   49088 type.go:168] "Request Body" body=""
	I1202 19:08:41.423829   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:41.424159   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:41.922904   49088 type.go:168] "Request Body" body=""
	I1202 19:08:41.922987   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:41.923363   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:42.423568   49088 type.go:168] "Request Body" body=""
	I1202 19:08:42.423637   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:42.423879   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:08:42.423921   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:08:42.854609   49088 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1202 19:08:42.913285   49088 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1202 19:08:42.916268   49088 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:08:42.916300   49088 retry.go:31] will retry after 24.794449401s: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	I1202 19:08:42.923470   49088 type.go:168] "Request Body" body=""
	I1202 19:08:42.923553   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:42.923860   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:43.423622   49088 type.go:168] "Request Body" body=""
	I1202 19:08:43.423694   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:43.423983   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:43.923751   49088 type.go:168] "Request Body" body=""
	I1202 19:08:43.923834   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:43.924123   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:44.422913   49088 type.go:168] "Request Body" body=""
	I1202 19:08:44.423014   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:44.423327   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:44.923006   49088 type.go:168] "Request Body" body=""
	I1202 19:08:44.923080   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:44.923422   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:08:44.923476   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:08:45.422929   49088 type.go:168] "Request Body" body=""
	I1202 19:08:45.423000   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:45.423274   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:45.922967   49088 type.go:168] "Request Body" body=""
	I1202 19:08:45.923040   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:45.923364   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:46.422980   49088 type.go:168] "Request Body" body=""
	I1202 19:08:46.423051   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:46.423401   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:46.922941   49088 type.go:168] "Request Body" body=""
	I1202 19:08:46.923013   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:46.923277   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:47.422968   49088 type.go:168] "Request Body" body=""
	I1202 19:08:47.423062   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:47.423388   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:08:47.423440   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:08:47.922972   49088 type.go:168] "Request Body" body=""
	I1202 19:08:47.923042   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:47.923383   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:48.423521   49088 type.go:168] "Request Body" body=""
	I1202 19:08:48.423697   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:48.424010   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:48.923782   49088 type.go:168] "Request Body" body=""
	I1202 19:08:48.923856   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:48.924186   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:49.422919   49088 type.go:168] "Request Body" body=""
	I1202 19:08:49.422992   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:49.423372   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:49.922931   49088 type.go:168] "Request Body" body=""
	I1202 19:08:49.923004   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:49.923306   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:08:49.923362   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:08:50.423001   49088 type.go:168] "Request Body" body=""
	I1202 19:08:50.423076   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:50.423400   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:50.922993   49088 type.go:168] "Request Body" body=""
	I1202 19:08:50.923072   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:50.923422   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:51.423082   49088 type.go:168] "Request Body" body=""
	I1202 19:08:51.423174   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:51.423497   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:51.922985   49088 type.go:168] "Request Body" body=""
	I1202 19:08:51.923056   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:51.923335   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:08:51.923382   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:08:52.423062   49088 type.go:168] "Request Body" body=""
	I1202 19:08:52.423141   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:52.423469   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:52.922906   49088 type.go:168] "Request Body" body=""
	I1202 19:08:52.922982   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:52.923304   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:53.422987   49088 type.go:168] "Request Body" body=""
	I1202 19:08:53.423074   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:53.423405   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:53.923177   49088 type.go:168] "Request Body" body=""
	I1202 19:08:53.923253   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:53.923577   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:08:53.923645   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:08:54.422880   49088 type.go:168] "Request Body" body=""
	I1202 19:08:54.422958   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:54.423220   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:54.923069   49088 type.go:168] "Request Body" body=""
	I1202 19:08:54.923142   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:54.923466   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:55.423373   49088 type.go:168] "Request Body" body=""
	I1202 19:08:55.423459   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:55.423806   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:55.923603   49088 type.go:168] "Request Body" body=""
	I1202 19:08:55.923681   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:55.923944   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:08:55.923992   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:08:56.423750   49088 type.go:168] "Request Body" body=""
	I1202 19:08:56.423821   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:56.424196   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:56.922939   49088 type.go:168] "Request Body" body=""
	I1202 19:08:56.923015   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:56.923399   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:57.423718   49088 type.go:168] "Request Body" body=""
	I1202 19:08:57.423789   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:57.424085   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:57.923903   49088 type.go:168] "Request Body" body=""
	I1202 19:08:57.923980   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:57.924302   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:08:57.924374   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:08:58.422958   49088 type.go:168] "Request Body" body=""
	I1202 19:08:58.423030   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:58.423367   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:58.923777   49088 type.go:168] "Request Body" body=""
	I1202 19:08:58.923851   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:58.924127   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:59.423881   49088 type.go:168] "Request Body" body=""
	I1202 19:08:59.423956   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:59.424305   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:08:59.922904   49088 type.go:168] "Request Body" body=""
	I1202 19:08:59.922978   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:08:59.923298   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:00.423245   49088 type.go:168] "Request Body" body=""
	I1202 19:09:00.423318   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:00.423619   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:09:00.423665   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:09:00.922984   49088 type.go:168] "Request Body" body=""
	I1202 19:09:00.923063   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:00.923384   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:01.422976   49088 type.go:168] "Request Body" body=""
	I1202 19:09:01.423055   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:01.423390   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:01.552630   49088 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml
	I1202 19:09:01.616821   49088 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1202 19:09:01.616872   49088 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1202 19:09:01.616967   49088 out.go:285] ! Enabling 'storage-provisioner' returned an error: running callbacks: [sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storage-provisioner.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storage-provisioner.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	]
	I1202 19:09:01.923268   49088 type.go:168] "Request Body" body=""
	I1202 19:09:01.923333   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:01.923595   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:02.423036   49088 type.go:168] "Request Body" body=""
	I1202 19:09:02.423106   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:02.423425   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:02.922957   49088 type.go:168] "Request Body" body=""
	I1202 19:09:02.923030   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:02.923349   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:09:02.923411   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:09:03.422870   49088 type.go:168] "Request Body" body=""
	I1202 19:09:03.422937   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:03.423202   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:03.922969   49088 type.go:168] "Request Body" body=""
	I1202 19:09:03.923048   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:03.923382   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:04.422966   49088 type.go:168] "Request Body" body=""
	I1202 19:09:04.423045   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:04.423360   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:04.923022   49088 type.go:168] "Request Body" body=""
	I1202 19:09:04.923097   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:04.923357   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:05.423319   49088 type.go:168] "Request Body" body=""
	I1202 19:09:05.423392   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:05.423740   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:09:05.423793   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:09:05.923326   49088 type.go:168] "Request Body" body=""
	I1202 19:09:05.923409   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:05.923718   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:06.423454   49088 type.go:168] "Request Body" body=""
	I1202 19:09:06.423525   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:06.423826   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:06.923640   49088 type.go:168] "Request Body" body=""
	I1202 19:09:06.923716   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:06.924092   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:07.423755   49088 type.go:168] "Request Body" body=""
	I1202 19:09:07.423833   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:07.424174   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:09:07.424240   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:09:07.711667   49088 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml
	I1202 19:09:07.768083   49088 command_runner.go:130] ! error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1202 19:09:07.771273   49088 addons.go:477] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	W1202 19:09:07.771371   49088 out.go:285] ! Enabling 'default-storageclass' returned an error: running callbacks: [sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl apply --force -f /etc/kubernetes/addons/storageclass.yaml: Process exited with status 1
	stdout:
	
	stderr:
	error: error validating "/etc/kubernetes/addons/storageclass.yaml": error validating data: failed to download openapi: Get "https://localhost:8441/openapi/v2?timeout=32s": dial tcp [::1]:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false
	]
	I1202 19:09:07.774489   49088 out.go:179] * Enabled addons: 
	I1202 19:09:07.778178   49088 addons.go:530] duration metric: took 1m42.874553995s for enable addons: enabled=[]
	I1202 19:09:07.923592   49088 type.go:168] "Request Body" body=""
	I1202 19:09:07.923663   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:07.923975   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:08.423753   49088 type.go:168] "Request Body" body=""
	I1202 19:09:08.423867   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:08.424222   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:08.922931   49088 type.go:168] "Request Body" body=""
	I1202 19:09:08.923003   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:08.923358   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:09.423790   49088 type.go:168] "Request Body" body=""
	I1202 19:09:09.423880   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:09.424153   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:09.922907   49088 type.go:168] "Request Body" body=""
	I1202 19:09:09.923001   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:09.923324   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:09:09.923374   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:09:10.423145   49088 type.go:168] "Request Body" body=""
	I1202 19:09:10.423260   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:10.423579   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:10.922932   49088 type.go:168] "Request Body" body=""
	I1202 19:09:10.923028   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:10.923400   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:11.422981   49088 type.go:168] "Request Body" body=""
	I1202 19:09:11.423062   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:11.423397   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:11.923082   49088 type.go:168] "Request Body" body=""
	I1202 19:09:11.923154   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:11.923464   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:09:11.923521   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:09:12.422896   49088 type.go:168] "Request Body" body=""
	I1202 19:09:12.422975   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:12.423250   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:12.922964   49088 type.go:168] "Request Body" body=""
	I1202 19:09:12.923043   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:12.923387   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:13.423093   49088 type.go:168] "Request Body" body=""
	I1202 19:09:13.423175   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:13.423500   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:13.923207   49088 type.go:168] "Request Body" body=""
	I1202 19:09:13.923272   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:13.923535   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:09:13.923574   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:09:14.422969   49088 type.go:168] "Request Body" body=""
	I1202 19:09:14.423065   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:14.423377   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:14.923293   49088 type.go:168] "Request Body" body=""
	I1202 19:09:14.923367   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:14.923688   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:15.423514   49088 type.go:168] "Request Body" body=""
	I1202 19:09:15.423584   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:15.423882   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:15.923633   49088 type.go:168] "Request Body" body=""
	I1202 19:09:15.923702   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:15.924013   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:09:15.924083   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:09:16.423892   49088 type.go:168] "Request Body" body=""
	I1202 19:09:16.423994   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:16.424346   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:16.922929   49088 type.go:168] "Request Body" body=""
	I1202 19:09:16.922996   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:16.923246   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:17.422959   49088 type.go:168] "Request Body" body=""
	I1202 19:09:17.423030   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:17.423344   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:17.923012   49088 type.go:168] "Request Body" body=""
	I1202 19:09:17.923112   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:17.923445   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:18.423579   49088 type.go:168] "Request Body" body=""
	I1202 19:09:18.423646   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:18.423912   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:09:18.423954   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:09:18.923689   49088 type.go:168] "Request Body" body=""
	I1202 19:09:18.923816   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:18.924164   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:19.423865   49088 type.go:168] "Request Body" body=""
	I1202 19:09:19.423938   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:19.424264   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:19.922971   49088 type.go:168] "Request Body" body=""
	I1202 19:09:19.923065   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:19.928773   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=5
	I1202 19:09:20.423719   49088 type.go:168] "Request Body" body=""
	I1202 19:09:20.423798   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:20.424108   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:09:20.424158   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:09:20.923797   49088 type.go:168] "Request Body" body=""
	I1202 19:09:20.923876   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:20.924234   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:21.423890   49088 type.go:168] "Request Body" body=""
	I1202 19:09:21.423990   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:21.424292   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:21.922965   49088 type.go:168] "Request Body" body=""
	I1202 19:09:21.923044   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:21.923382   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:22.422969   49088 type.go:168] "Request Body" body=""
	I1202 19:09:22.423043   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:22.423383   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:22.923067   49088 type.go:168] "Request Body" body=""
	I1202 19:09:22.923150   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:22.923449   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:09:22.923501   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:09:23.423230   49088 type.go:168] "Request Body" body=""
	I1202 19:09:23.423312   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:23.423745   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:23.922960   49088 type.go:168] "Request Body" body=""
	I1202 19:09:23.923037   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:23.923364   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:24.423610   49088 type.go:168] "Request Body" body=""
	I1202 19:09:24.423697   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:24.423973   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:24.922875   49088 type.go:168] "Request Body" body=""
	I1202 19:09:24.922950   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:24.923296   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:25.422971   49088 type.go:168] "Request Body" body=""
	I1202 19:09:25.423045   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:25.423397   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:09:25.423453   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:09:25.923781   49088 type.go:168] "Request Body" body=""
	I1202 19:09:25.923849   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:25.924111   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:26.423856   49088 type.go:168] "Request Body" body=""
	I1202 19:09:26.423928   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:26.424242   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:26.922969   49088 type.go:168] "Request Body" body=""
	I1202 19:09:26.923039   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:26.923392   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:27.423069   49088 type.go:168] "Request Body" body=""
	I1202 19:09:27.423144   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:27.423407   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:27.922946   49088 type.go:168] "Request Body" body=""
	I1202 19:09:27.923018   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:27.923358   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:09:27.923411   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:09:28.423076   49088 type.go:168] "Request Body" body=""
	I1202 19:09:28.423152   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:28.423495   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:28.923840   49088 type.go:168] "Request Body" body=""
	I1202 19:09:28.923913   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:28.924219   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:29.422911   49088 type.go:168] "Request Body" body=""
	I1202 19:09:29.422989   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:29.423326   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:29.923042   49088 type.go:168] "Request Body" body=""
	I1202 19:09:29.923119   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:29.923480   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:09:29.923538   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:09:30.422961   49088 type.go:168] "Request Body" body=""
	I1202 19:09:30.423047   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:30.423371   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:30.922960   49088 type.go:168] "Request Body" body=""
	I1202 19:09:30.923040   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:30.923370   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:31.423083   49088 type.go:168] "Request Body" body=""
	I1202 19:09:31.423155   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:31.423507   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:31.922881   49088 type.go:168] "Request Body" body=""
	I1202 19:09:31.922954   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:31.923312   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:32.422968   49088 type.go:168] "Request Body" body=""
	I1202 19:09:32.423060   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:32.423400   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:09:32.423472   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:09:32.922956   49088 type.go:168] "Request Body" body=""
	I1202 19:09:32.923050   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:32.923391   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:33.423100   49088 type.go:168] "Request Body" body=""
	I1202 19:09:33.423172   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:33.423473   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:33.922963   49088 type.go:168] "Request Body" body=""
	I1202 19:09:33.923039   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:33.923379   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:34.423096   49088 type.go:168] "Request Body" body=""
	I1202 19:09:34.423177   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:34.423484   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:09:34.423532   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:09:34.923381   49088 type.go:168] "Request Body" body=""
	I1202 19:09:34.923452   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:34.923763   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:35.423619   49088 type.go:168] "Request Body" body=""
	I1202 19:09:35.423698   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:35.424059   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:35.923863   49088 type.go:168] "Request Body" body=""
	I1202 19:09:35.923933   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:35.924297   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:36.423817   49088 type.go:168] "Request Body" body=""
	I1202 19:09:36.423883   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:36.424153   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:09:36.424193   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:09:36.922887   49088 type.go:168] "Request Body" body=""
	I1202 19:09:36.922964   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:36.923304   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:37.422984   49088 type.go:168] "Request Body" body=""
	I1202 19:09:37.423062   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:37.423400   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:37.922914   49088 type.go:168] "Request Body" body=""
	I1202 19:09:37.922987   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:37.923253   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:38.422943   49088 type.go:168] "Request Body" body=""
	I1202 19:09:38.423020   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:38.423341   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:38.922960   49088 type.go:168] "Request Body" body=""
	I1202 19:09:38.923084   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:38.923406   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:09:38.923459   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:09:39.423118   49088 type.go:168] "Request Body" body=""
	I1202 19:09:39.423188   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:39.423443   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:39.922963   49088 type.go:168] "Request Body" body=""
	I1202 19:09:39.923038   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:39.923390   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:40.422957   49088 type.go:168] "Request Body" body=""
	I1202 19:09:40.423031   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:40.423438   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:40.922909   49088 type.go:168] "Request Body" body=""
	I1202 19:09:40.922985   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:40.923328   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:41.422954   49088 type.go:168] "Request Body" body=""
	I1202 19:09:41.423029   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:41.423387   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:09:41.423450   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:09:41.923108   49088 type.go:168] "Request Body" body=""
	I1202 19:09:41.923187   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:41.923536   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:42.423214   49088 type.go:168] "Request Body" body=""
	I1202 19:09:42.423293   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:42.423567   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:42.922993   49088 type.go:168] "Request Body" body=""
	I1202 19:09:42.923073   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:42.923422   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:43.422977   49088 type.go:168] "Request Body" body=""
	I1202 19:09:43.423055   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:43.423398   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:43.923097   49088 type.go:168] "Request Body" body=""
	I1202 19:09:43.923183   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:43.923554   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:09:43.923610   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:09:44.422960   49088 type.go:168] "Request Body" body=""
	I1202 19:09:44.423031   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:44.423372   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:44.923133   49088 type.go:168] "Request Body" body=""
	I1202 19:09:44.923217   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:44.923568   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:45.422996   49088 type.go:168] "Request Body" body=""
	I1202 19:09:45.423065   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:45.423325   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:45.922966   49088 type.go:168] "Request Body" body=""
	I1202 19:09:45.923043   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:45.923392   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:46.422965   49088 type.go:168] "Request Body" body=""
	I1202 19:09:46.423041   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:46.423338   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:09:46.423383   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:09:46.923662   49088 type.go:168] "Request Body" body=""
	I1202 19:09:46.923729   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:46.923996   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:47.423794   49088 type.go:168] "Request Body" body=""
	I1202 19:09:47.423868   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:47.424199   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:47.922887   49088 type.go:168] "Request Body" body=""
	I1202 19:09:47.922970   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:47.923290   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:48.423862   49088 type.go:168] "Request Body" body=""
	I1202 19:09:48.423930   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:48.424197   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:09:48.424242   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:09:48.922882   49088 type.go:168] "Request Body" body=""
	I1202 19:09:48.922964   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:48.923305   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:49.422874   49088 type.go:168] "Request Body" body=""
	I1202 19:09:49.422952   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:49.423501   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:49.923227   49088 type.go:168] "Request Body" body=""
	I1202 19:09:49.923298   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:49.923571   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:50.423530   49088 type.go:168] "Request Body" body=""
	I1202 19:09:50.423605   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:50.423930   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:50.923709   49088 type.go:168] "Request Body" body=""
	I1202 19:09:50.923791   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:50.924129   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:09:50.924183   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:09:51.423574   49088 type.go:168] "Request Body" body=""
	I1202 19:09:51.423645   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:51.423989   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:51.923773   49088 type.go:168] "Request Body" body=""
	I1202 19:09:51.923846   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:51.924175   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:52.423792   49088 type.go:168] "Request Body" body=""
	I1202 19:09:52.423865   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:52.424194   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:52.923791   49088 type.go:168] "Request Body" body=""
	I1202 19:09:52.923863   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:52.924133   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:53.423850   49088 type.go:168] "Request Body" body=""
	I1202 19:09:53.423929   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:53.424252   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:09:53.424366   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:09:53.922972   49088 type.go:168] "Request Body" body=""
	I1202 19:09:53.923047   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:53.923376   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:54.422922   49088 type.go:168] "Request Body" body=""
	I1202 19:09:54.422999   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:54.423364   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:54.923085   49088 type.go:168] "Request Body" body=""
	I1202 19:09:54.923175   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:54.923548   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:55.422950   49088 type.go:168] "Request Body" body=""
	I1202 19:09:55.423041   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:55.423400   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:55.923676   49088 type.go:168] "Request Body" body=""
	I1202 19:09:55.923766   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:55.924024   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:09:55.924066   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:09:56.423823   49088 type.go:168] "Request Body" body=""
	I1202 19:09:56.423900   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:56.424217   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:56.922947   49088 type.go:168] "Request Body" body=""
	I1202 19:09:56.923022   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:56.923366   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:57.422911   49088 type.go:168] "Request Body" body=""
	I1202 19:09:57.422984   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:57.423240   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:57.922952   49088 type.go:168] "Request Body" body=""
	I1202 19:09:57.923033   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:57.923361   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:58.422959   49088 type.go:168] "Request Body" body=""
	I1202 19:09:58.423033   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:58.423382   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:09:58.423443   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:09:58.923738   49088 type.go:168] "Request Body" body=""
	I1202 19:09:58.923812   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:58.924072   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:59.423859   49088 type.go:168] "Request Body" body=""
	I1202 19:09:59.423937   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:59.424270   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:09:59.922977   49088 type.go:168] "Request Body" body=""
	I1202 19:09:59.923052   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:09:59.923398   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:00.435478   49088 type.go:168] "Request Body" body=""
	I1202 19:10:00.435562   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:00.435862   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:10:00.435913   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:10:00.923635   49088 type.go:168] "Request Body" body=""
	I1202 19:10:00.923715   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:00.924056   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:01.423705   49088 type.go:168] "Request Body" body=""
	I1202 19:10:01.423779   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:01.424081   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:01.923812   49088 type.go:168] "Request Body" body=""
	I1202 19:10:01.923878   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:01.924156   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:02.423889   49088 type.go:168] "Request Body" body=""
	I1202 19:10:02.423969   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:02.424269   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:02.922968   49088 type.go:168] "Request Body" body=""
	I1202 19:10:02.923056   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:02.923443   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:10:02.923500   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:10:03.423151   49088 type.go:168] "Request Body" body=""
	I1202 19:10:03.423226   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:03.423486   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:03.922960   49088 type.go:168] "Request Body" body=""
	I1202 19:10:03.923040   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:03.923371   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:04.422986   49088 type.go:168] "Request Body" body=""
	I1202 19:10:04.423059   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:04.423412   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:04.923351   49088 type.go:168] "Request Body" body=""
	I1202 19:10:04.923425   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:04.923691   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:10:04.923736   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:10:05.423841   49088 type.go:168] "Request Body" body=""
	I1202 19:10:05.423932   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:05.424309   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:05.922992   49088 type.go:168] "Request Body" body=""
	I1202 19:10:05.923078   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:05.923444   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:06.423123   49088 type.go:168] "Request Body" body=""
	I1202 19:10:06.423232   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:06.423495   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:06.922977   49088 type.go:168] "Request Body" body=""
	I1202 19:10:06.923050   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:06.923408   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:07.422988   49088 type.go:168] "Request Body" body=""
	I1202 19:10:07.423069   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:07.423489   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:10:07.423547   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:10:07.923028   49088 type.go:168] "Request Body" body=""
	I1202 19:10:07.923107   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:07.923442   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:08.422975   49088 type.go:168] "Request Body" body=""
	I1202 19:10:08.423051   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:08.423408   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:08.922956   49088 type.go:168] "Request Body" body=""
	I1202 19:10:08.923038   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:08.923365   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:09.423008   49088 type.go:168] "Request Body" body=""
	I1202 19:10:09.423094   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:09.423439   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:09.922983   49088 type.go:168] "Request Body" body=""
	I1202 19:10:09.923062   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:09.923410   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:10:09.923465   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:10:10.422981   49088 type.go:168] "Request Body" body=""
	I1202 19:10:10.423060   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:10.423387   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:10.922955   49088 type.go:168] "Request Body" body=""
	I1202 19:10:10.923025   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:10.923302   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:11.422980   49088 type.go:168] "Request Body" body=""
	I1202 19:10:11.423052   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:11.423400   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:11.922952   49088 type.go:168] "Request Body" body=""
	I1202 19:10:11.923032   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:11.923379   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:12.422929   49088 type.go:168] "Request Body" body=""
	I1202 19:10:12.423003   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:12.423285   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:10:12.423327   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:10:12.922978   49088 type.go:168] "Request Body" body=""
	I1202 19:10:12.923051   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:12.923388   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:13.422975   49088 type.go:168] "Request Body" body=""
	I1202 19:10:13.423050   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:13.423417   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:13.922921   49088 type.go:168] "Request Body" body=""
	I1202 19:10:13.922997   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:13.923304   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:14.422948   49088 type.go:168] "Request Body" body=""
	I1202 19:10:14.423026   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:14.423361   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:10:14.423418   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:10:14.923139   49088 type.go:168] "Request Body" body=""
	I1202 19:10:14.923221   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:14.923551   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:15.423337   49088 type.go:168] "Request Body" body=""
	I1202 19:10:15.423420   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:15.423733   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:15.922959   49088 type.go:168] "Request Body" body=""
	I1202 19:10:15.923040   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:15.923380   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:16.422956   49088 type.go:168] "Request Body" body=""
	I1202 19:10:16.423036   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:16.423383   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:16.923752   49088 type.go:168] "Request Body" body=""
	I1202 19:10:16.923818   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:16.924073   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:10:16.924112   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:10:17.423826   49088 type.go:168] "Request Body" body=""
	I1202 19:10:17.423915   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:17.424256   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:17.922988   49088 type.go:168] "Request Body" body=""
	I1202 19:10:17.923068   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:17.923403   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:18.422876   49088 type.go:168] "Request Body" body=""
	I1202 19:10:18.422953   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:18.423220   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:18.922913   49088 type.go:168] "Request Body" body=""
	I1202 19:10:18.922988   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:18.923326   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:19.423037   49088 type.go:168] "Request Body" body=""
	I1202 19:10:19.423111   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:19.423450   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:10:19.423505   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:10:19.923780   49088 type.go:168] "Request Body" body=""
	I1202 19:10:19.923847   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:19.924112   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:20.423105   49088 type.go:168] "Request Body" body=""
	I1202 19:10:20.423212   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:20.423516   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:20.922965   49088 type.go:168] "Request Body" body=""
	I1202 19:10:20.923060   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:20.923378   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:21.423065   49088 type.go:168] "Request Body" body=""
	I1202 19:10:21.423140   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:21.423414   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:21.923020   49088 type.go:168] "Request Body" body=""
	I1202 19:10:21.923093   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:21.923371   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:10:21.923415   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:10:22.423093   49088 type.go:168] "Request Body" body=""
	I1202 19:10:22.423166   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:22.423511   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:22.923085   49088 type.go:168] "Request Body" body=""
	I1202 19:10:22.923154   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:22.923434   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:23.423060   49088 type.go:168] "Request Body" body=""
	I1202 19:10:23.423133   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:23.423446   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:23.923195   49088 type.go:168] "Request Body" body=""
	I1202 19:10:23.923269   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:23.923577   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:10:23.923622   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:10:24.422877   49088 type.go:168] "Request Body" body=""
	I1202 19:10:24.422957   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:24.423230   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:24.923115   49088 type.go:168] "Request Body" body=""
	I1202 19:10:24.923194   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:24.923600   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:25.423537   49088 type.go:168] "Request Body" body=""
	I1202 19:10:25.423610   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:25.423949   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:25.923703   49088 type.go:168] "Request Body" body=""
	I1202 19:10:25.923775   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:25.924043   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:10:25.924103   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:10:26.423902   49088 type.go:168] "Request Body" body=""
	I1202 19:10:26.423982   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:26.424292   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:26.922962   49088 type.go:168] "Request Body" body=""
	I1202 19:10:26.923073   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:26.923396   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:27.423706   49088 type.go:168] "Request Body" body=""
	I1202 19:10:27.423778   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:27.424090   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:27.923873   49088 type.go:168] "Request Body" body=""
	I1202 19:10:27.923954   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:27.924307   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:10:27.924397   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:10:28.422900   49088 type.go:168] "Request Body" body=""
	I1202 19:10:28.423017   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:28.423349   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:28.922918   49088 type.go:168] "Request Body" body=""
	I1202 19:10:28.922988   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:28.923270   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:29.422990   49088 type.go:168] "Request Body" body=""
	I1202 19:10:29.423072   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:29.423426   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:29.922978   49088 type.go:168] "Request Body" body=""
	I1202 19:10:29.923053   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:29.923386   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:30.422933   49088 type.go:168] "Request Body" body=""
	I1202 19:10:30.423008   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:30.423306   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:10:30.423392   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:10:30.923054   49088 type.go:168] "Request Body" body=""
	I1202 19:10:30.923133   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:30.923439   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:31.423123   49088 type.go:168] "Request Body" body=""
	I1202 19:10:31.423203   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:31.423539   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:31.923853   49088 type.go:168] "Request Body" body=""
	I1202 19:10:31.923920   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:31.924180   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:32.422889   49088 type.go:168] "Request Body" body=""
	I1202 19:10:32.422970   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:32.423319   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:32.922910   49088 type.go:168] "Request Body" body=""
	I1202 19:10:32.922985   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:32.923321   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:10:32.923375   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:10:33.423014   49088 type.go:168] "Request Body" body=""
	I1202 19:10:33.423095   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:33.423398   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:33.923072   49088 type.go:168] "Request Body" body=""
	I1202 19:10:33.923148   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:33.923513   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:34.423108   49088 type.go:168] "Request Body" body=""
	I1202 19:10:34.423190   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:34.423541   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:34.923286   49088 type.go:168] "Request Body" body=""
	I1202 19:10:34.923355   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:34.923630   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:10:34.923672   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:10:35.423752   49088 type.go:168] "Request Body" body=""
	I1202 19:10:35.423834   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:35.424190   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:35.922887   49088 type.go:168] "Request Body" body=""
	I1202 19:10:35.922967   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:35.923295   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:36.422982   49088 type.go:168] "Request Body" body=""
	I1202 19:10:36.423054   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:36.423305   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:36.923037   49088 type.go:168] "Request Body" body=""
	I1202 19:10:36.923115   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:36.923442   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:37.423029   49088 type.go:168] "Request Body" body=""
	I1202 19:10:37.423102   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:37.423425   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:10:37.423480   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:10:37.923773   49088 type.go:168] "Request Body" body=""
	I1202 19:10:37.923861   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:37.924136   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:38.423899   49088 type.go:168] "Request Body" body=""
	I1202 19:10:38.423979   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:38.424296   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:38.922970   49088 type.go:168] "Request Body" body=""
	I1202 19:10:38.923051   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:38.923372   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:39.423713   49088 type.go:168] "Request Body" body=""
	I1202 19:10:39.423780   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:39.424040   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:10:39.424081   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:10:39.923835   49088 type.go:168] "Request Body" body=""
	I1202 19:10:39.923908   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:39.924227   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:40.423209   49088 type.go:168] "Request Body" body=""
	I1202 19:10:40.423286   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:40.423612   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:40.923000   49088 type.go:168] "Request Body" body=""
	I1202 19:10:40.923083   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:40.923387   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:41.422980   49088 type.go:168] "Request Body" body=""
	I1202 19:10:41.423055   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:41.423415   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:41.922974   49088 type.go:168] "Request Body" body=""
	I1202 19:10:41.923048   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:41.923367   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:10:41.923430   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:10:42.422892   49088 type.go:168] "Request Body" body=""
	I1202 19:10:42.422960   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:42.423218   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:42.922977   49088 type.go:168] "Request Body" body=""
	I1202 19:10:42.923049   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:42.923389   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:43.422967   49088 type.go:168] "Request Body" body=""
	I1202 19:10:43.423039   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:43.423388   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:43.922911   49088 type.go:168] "Request Body" body=""
	I1202 19:10:43.922995   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:43.923286   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:44.422975   49088 type.go:168] "Request Body" body=""
	I1202 19:10:44.423057   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:44.423388   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:10:44.423441   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:10:44.923165   49088 type.go:168] "Request Body" body=""
	I1202 19:10:44.923245   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:44.923572   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:45.423613   49088 type.go:168] "Request Body" body=""
	I1202 19:10:45.423695   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:45.423958   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:45.924133   49088 type.go:168] "Request Body" body=""
	I1202 19:10:45.924208   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:45.924557   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:46.423281   49088 type.go:168] "Request Body" body=""
	I1202 19:10:46.423365   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:46.423700   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:10:46.423760   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:10:46.923435   49088 type.go:168] "Request Body" body=""
	I1202 19:10:46.923504   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:46.923772   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:47.423542   49088 type.go:168] "Request Body" body=""
	I1202 19:10:47.423616   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:47.423946   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:47.923718   49088 type.go:168] "Request Body" body=""
	I1202 19:10:47.923790   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:47.924128   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:48.423446   49088 type.go:168] "Request Body" body=""
	I1202 19:10:48.423517   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:48.423772   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:10:48.423814   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:10:48.923540   49088 type.go:168] "Request Body" body=""
	I1202 19:10:48.923616   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:48.923948   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:49.423625   49088 type.go:168] "Request Body" body=""
	I1202 19:10:49.423703   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:49.424044   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:49.923749   49088 type.go:168] "Request Body" body=""
	I1202 19:10:49.923817   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:49.924118   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:50.422979   49088 type.go:168] "Request Body" body=""
	I1202 19:10:50.423051   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:50.423386   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:50.923086   49088 type.go:168] "Request Body" body=""
	I1202 19:10:50.923163   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:50.923498   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:10:50.923558   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:10:51.422877   49088 type.go:168] "Request Body" body=""
	I1202 19:10:51.422947   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:51.423236   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:51.922968   49088 type.go:168] "Request Body" body=""
	I1202 19:10:51.923043   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:51.923370   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:52.423002   49088 type.go:168] "Request Body" body=""
	I1202 19:10:52.423076   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:52.423410   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:52.922879   49088 type.go:168] "Request Body" body=""
	I1202 19:10:52.922948   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:52.923224   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:53.422960   49088 type.go:168] "Request Body" body=""
	I1202 19:10:53.423034   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:53.423361   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:10:53.423412   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:10:53.923077   49088 type.go:168] "Request Body" body=""
	I1202 19:10:53.923157   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:53.923495   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:54.422917   49088 type.go:168] "Request Body" body=""
	I1202 19:10:54.422986   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:54.423246   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:54.923135   49088 type.go:168] "Request Body" body=""
	I1202 19:10:54.923208   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:54.923556   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:55.423559   49088 type.go:168] "Request Body" body=""
	I1202 19:10:55.423636   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:55.423971   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:10:55.424022   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:10:55.923605   49088 type.go:168] "Request Body" body=""
	I1202 19:10:55.923678   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:55.923946   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:56.423714   49088 type.go:168] "Request Body" body=""
	I1202 19:10:56.423787   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:56.424129   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:56.923785   49088 type.go:168] "Request Body" body=""
	I1202 19:10:56.923856   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:56.924173   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:57.423656   49088 type.go:168] "Request Body" body=""
	I1202 19:10:57.423734   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:57.423996   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:57.923693   49088 type.go:168] "Request Body" body=""
	I1202 19:10:57.923764   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:57.924078   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:10:57.924131   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:10:58.423895   49088 type.go:168] "Request Body" body=""
	I1202 19:10:58.423973   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:58.424286   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:58.922875   49088 type.go:168] "Request Body" body=""
	I1202 19:10:58.922950   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:58.923200   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:59.422980   49088 type.go:168] "Request Body" body=""
	I1202 19:10:59.423061   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:59.423383   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:10:59.922968   49088 type.go:168] "Request Body" body=""
	I1202 19:10:59.923042   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:10:59.923368   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:00.423287   49088 type.go:168] "Request Body" body=""
	I1202 19:11:00.423360   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:00.423657   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:11:00.423700   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:11:00.922983   49088 type.go:168] "Request Body" body=""
	I1202 19:11:00.923059   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:00.923393   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:01.423104   49088 type.go:168] "Request Body" body=""
	I1202 19:11:01.423186   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:01.423527   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:01.923028   49088 type.go:168] "Request Body" body=""
	I1202 19:11:01.923097   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:01.923355   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:02.422961   49088 type.go:168] "Request Body" body=""
	I1202 19:11:02.423036   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:02.423393   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:02.923093   49088 type.go:168] "Request Body" body=""
	I1202 19:11:02.923169   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:02.923483   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:11:02.923543   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:11:03.422923   49088 type.go:168] "Request Body" body=""
	I1202 19:11:03.422993   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:03.423275   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:03.922966   49088 type.go:168] "Request Body" body=""
	I1202 19:11:03.923046   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:03.923401   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:04.423109   49088 type.go:168] "Request Body" body=""
	I1202 19:11:04.423183   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:04.423480   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:04.922963   49088 type.go:168] "Request Body" body=""
	I1202 19:11:04.923029   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:04.923291   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:05.422943   49088 type.go:168] "Request Body" body=""
	I1202 19:11:05.423019   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:05.423416   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:11:05.423485   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:11:05.922975   49088 type.go:168] "Request Body" body=""
	I1202 19:11:05.923049   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:05.923391   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:06.423068   49088 type.go:168] "Request Body" body=""
	I1202 19:11:06.423143   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:06.423404   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:06.922954   49088 type.go:168] "Request Body" body=""
	I1202 19:11:06.923033   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:06.923371   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:07.423090   49088 type.go:168] "Request Body" body=""
	I1202 19:11:07.423173   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:07.423511   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:11:07.423577   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:11:07.922961   49088 type.go:168] "Request Body" body=""
	I1202 19:11:07.923059   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:07.923477   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:08.423196   49088 type.go:168] "Request Body" body=""
	I1202 19:11:08.423268   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:08.423618   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:08.923317   49088 type.go:168] "Request Body" body=""
	I1202 19:11:08.923395   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:08.923714   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:09.423471   49088 type.go:168] "Request Body" body=""
	I1202 19:11:09.423536   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:09.423793   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:11:09.423831   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:11:09.923592   49088 type.go:168] "Request Body" body=""
	I1202 19:11:09.923672   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:09.923995   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:10.422883   49088 type.go:168] "Request Body" body=""
	I1202 19:11:10.422965   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:10.423316   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:10.922966   49088 type.go:168] "Request Body" body=""
	I1202 19:11:10.923035   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:10.923371   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:11.422986   49088 type.go:168] "Request Body" body=""
	I1202 19:11:11.423060   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:11.423401   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:11.923095   49088 type.go:168] "Request Body" body=""
	I1202 19:11:11.923169   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:11.923521   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:11:11.923576   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:11:12.423858   49088 type.go:168] "Request Body" body=""
	I1202 19:11:12.423929   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:12.424221   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:12.922920   49088 type.go:168] "Request Body" body=""
	I1202 19:11:12.923007   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:12.923376   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:13.423103   49088 type.go:168] "Request Body" body=""
	I1202 19:11:13.423187   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:13.423573   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:13.923844   49088 type.go:168] "Request Body" body=""
	I1202 19:11:13.923913   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:13.924261   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:11:13.924357   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:11:14.422970   49088 type.go:168] "Request Body" body=""
	I1202 19:11:14.423042   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:14.423400   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:14.923176   49088 type.go:168] "Request Body" body=""
	I1202 19:11:14.923259   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:14.923607   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:15.423538   49088 type.go:168] "Request Body" body=""
	I1202 19:11:15.423610   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:15.423918   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:15.923744   49088 type.go:168] "Request Body" body=""
	I1202 19:11:15.923820   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:15.924119   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:16.423897   49088 type.go:168] "Request Body" body=""
	I1202 19:11:16.423968   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:16.424281   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:11:16.424354   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:11:16.922924   49088 type.go:168] "Request Body" body=""
	I1202 19:11:16.923006   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:16.923265   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:17.422970   49088 type.go:168] "Request Body" body=""
	I1202 19:11:17.423042   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:17.423367   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:17.922982   49088 type.go:168] "Request Body" body=""
	I1202 19:11:17.923056   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:17.923356   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:18.422877   49088 type.go:168] "Request Body" body=""
	I1202 19:11:18.422949   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:18.423201   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:18.922936   49088 type.go:168] "Request Body" body=""
	I1202 19:11:18.923013   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:18.923346   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:11:18.923404   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:11:19.422980   49088 type.go:168] "Request Body" body=""
	I1202 19:11:19.423053   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:19.423374   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:19.922930   49088 type.go:168] "Request Body" body=""
	I1202 19:11:19.923009   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:19.923288   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:20.422975   49088 type.go:168] "Request Body" body=""
	I1202 19:11:20.423047   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:20.423341   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:20.923042   49088 type.go:168] "Request Body" body=""
	I1202 19:11:20.923120   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:20.923474   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:11:20.923528   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:11:21.423170   49088 type.go:168] "Request Body" body=""
	I1202 19:11:21.423246   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:21.423533   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:21.922987   49088 type.go:168] "Request Body" body=""
	I1202 19:11:21.923069   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:21.923428   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:22.423136   49088 type.go:168] "Request Body" body=""
	I1202 19:11:22.423218   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:22.423580   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:22.923816   49088 type.go:168] "Request Body" body=""
	I1202 19:11:22.923890   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:22.924148   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:11:22.924186   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:11:23.422866   49088 type.go:168] "Request Body" body=""
	I1202 19:11:23.422936   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:23.423286   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:23.923262   49088 type.go:168] "Request Body" body=""
	I1202 19:11:23.923352   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:23.923994   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:24.422942   49088 type.go:168] "Request Body" body=""
	I1202 19:11:24.423029   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:24.423374   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:24.922937   49088 type.go:168] "Request Body" body=""
	I1202 19:11:24.923030   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:24.923429   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:25.422933   49088 type.go:168] "Request Body" body=""
	I1202 19:11:25.423008   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:25.423357   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:11:25.423414   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:11:25.922926   49088 type.go:168] "Request Body" body=""
	I1202 19:11:25.922991   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:25.923253   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:26.422987   49088 type.go:168] "Request Body" body=""
	I1202 19:11:26.423057   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:26.423401   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:26.923094   49088 type.go:168] "Request Body" body=""
	I1202 19:11:26.923165   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:26.923469   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:27.422920   49088 type.go:168] "Request Body" body=""
	I1202 19:11:27.422991   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:27.423278   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:27.922888   49088 type.go:168] "Request Body" body=""
	I1202 19:11:27.922961   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:27.923314   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:11:27.923368   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:11:28.422912   49088 type.go:168] "Request Body" body=""
	I1202 19:11:28.422990   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:28.423375   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:28.923772   49088 type.go:168] "Request Body" body=""
	I1202 19:11:28.923838   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:28.924083   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:29.423850   49088 type.go:168] "Request Body" body=""
	I1202 19:11:29.423927   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:29.424260   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:29.923896   49088 type.go:168] "Request Body" body=""
	I1202 19:11:29.923968   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:29.924284   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:11:29.924358   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:11:30.423879   49088 type.go:168] "Request Body" body=""
	I1202 19:11:30.423953   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:30.424220   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:30.922930   49088 type.go:168] "Request Body" body=""
	I1202 19:11:30.923004   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:30.923350   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:31.423073   49088 type.go:168] "Request Body" body=""
	I1202 19:11:31.423147   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:31.423507   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:31.922953   49088 type.go:168] "Request Body" body=""
	I1202 19:11:31.923031   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:31.923337   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:32.422971   49088 type.go:168] "Request Body" body=""
	I1202 19:11:32.423046   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:32.423366   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:11:32.423417   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:11:32.923034   49088 type.go:168] "Request Body" body=""
	I1202 19:11:32.923109   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:32.923438   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:33.423135   49088 type.go:168] "Request Body" body=""
	I1202 19:11:33.423204   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:33.423464   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:33.922974   49088 type.go:168] "Request Body" body=""
	I1202 19:11:33.923049   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:33.923387   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:34.423091   49088 type.go:168] "Request Body" body=""
	I1202 19:11:34.423168   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:34.423523   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:11:34.423580   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:11:34.923239   49088 type.go:168] "Request Body" body=""
	I1202 19:11:34.923307   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:34.923573   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:35.423667   49088 type.go:168] "Request Body" body=""
	I1202 19:11:35.423743   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:35.424088   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:35.923861   49088 type.go:168] "Request Body" body=""
	I1202 19:11:35.923938   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:35.924296   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:36.422936   49088 type.go:168] "Request Body" body=""
	I1202 19:11:36.423001   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:36.423257   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:36.922952   49088 type.go:168] "Request Body" body=""
	I1202 19:11:36.923029   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:36.923375   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:11:36.923429   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:11:37.422938   49088 type.go:168] "Request Body" body=""
	I1202 19:11:37.423016   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:37.423347   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:37.923043   49088 type.go:168] "Request Body" body=""
	I1202 19:11:37.923123   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:37.923400   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:38.422941   49088 type.go:168] "Request Body" body=""
	I1202 19:11:38.423011   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:38.423321   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:38.922959   49088 type.go:168] "Request Body" body=""
	I1202 19:11:38.923034   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:38.923352   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:39.422867   49088 type.go:168] "Request Body" body=""
	I1202 19:11:39.422947   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:39.423239   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:11:39.423286   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:11:39.922967   49088 type.go:168] "Request Body" body=""
	I1202 19:11:39.923043   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:39.923373   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:40.422980   49088 type.go:168] "Request Body" body=""
	I1202 19:11:40.423062   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:40.423412   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:40.923647   49088 type.go:168] "Request Body" body=""
	I1202 19:11:40.923717   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:40.923978   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:41.423742   49088 type.go:168] "Request Body" body=""
	I1202 19:11:41.423821   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:41.424168   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:11:41.424221   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:11:41.922872   49088 type.go:168] "Request Body" body=""
	I1202 19:11:41.922945   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:41.923310   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:42.423001   49088 type.go:168] "Request Body" body=""
	I1202 19:11:42.423082   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:42.423364   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:42.922973   49088 type.go:168] "Request Body" body=""
	I1202 19:11:42.923054   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:42.923395   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:43.422979   49088 type.go:168] "Request Body" body=""
	I1202 19:11:43.423052   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:43.423368   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:43.922915   49088 type.go:168] "Request Body" body=""
	I1202 19:11:43.922986   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:43.923252   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:11:43.923292   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:11:44.422977   49088 type.go:168] "Request Body" body=""
	I1202 19:11:44.423058   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:44.423399   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:44.923181   49088 type.go:168] "Request Body" body=""
	I1202 19:11:44.923261   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:44.923585   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:45.423566   49088 type.go:168] "Request Body" body=""
	I1202 19:11:45.423644   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:45.423912   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:45.923624   49088 type.go:168] "Request Body" body=""
	I1202 19:11:45.923703   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:45.924023   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:11:45.924071   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:11:46.423663   49088 type.go:168] "Request Body" body=""
	I1202 19:11:46.423734   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:46.424056   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:46.923091   49088 type.go:168] "Request Body" body=""
	I1202 19:11:46.923157   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:46.923456   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:47.423153   49088 type.go:168] "Request Body" body=""
	I1202 19:11:47.423230   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:47.423569   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:47.923278   49088 type.go:168] "Request Body" body=""
	I1202 19:11:47.923362   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:47.923689   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:48.422896   49088 type.go:168] "Request Body" body=""
	I1202 19:11:48.422973   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:48.423231   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:11:48.423281   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:11:48.922955   49088 type.go:168] "Request Body" body=""
	I1202 19:11:48.923028   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:48.923392   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:49.423094   49088 type.go:168] "Request Body" body=""
	I1202 19:11:49.423172   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:49.423529   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:49.923206   49088 type.go:168] "Request Body" body=""
	I1202 19:11:49.923275   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:49.923532   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:50.423633   49088 type.go:168] "Request Body" body=""
	I1202 19:11:50.423711   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:50.424022   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:11:50.424076   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:11:50.923805   49088 type.go:168] "Request Body" body=""
	I1202 19:11:50.923878   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:50.924219   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:51.423760   49088 type.go:168] "Request Body" body=""
	I1202 19:11:51.423833   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:51.424113   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:51.923870   49088 type.go:168] "Request Body" body=""
	I1202 19:11:51.923950   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:51.924286   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:52.422868   49088 type.go:168] "Request Body" body=""
	I1202 19:11:52.422952   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:52.423285   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:52.923892   49088 type.go:168] "Request Body" body=""
	I1202 19:11:52.923962   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:52.924248   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:11:52.924291   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:11:53.422923   49088 type.go:168] "Request Body" body=""
	I1202 19:11:53.423002   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:53.423357   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:53.922965   49088 type.go:168] "Request Body" body=""
	I1202 19:11:53.923041   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:53.923384   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:54.422914   49088 type.go:168] "Request Body" body=""
	I1202 19:11:54.422991   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:54.423296   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:54.922927   49088 type.go:168] "Request Body" body=""
	I1202 19:11:54.923011   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:54.923324   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:55.423007   49088 type.go:168] "Request Body" body=""
	I1202 19:11:55.423084   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:55.423406   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:11:55.423459   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:11:55.922966   49088 type.go:168] "Request Body" body=""
	I1202 19:11:55.923046   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:55.923318   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:56.423006   49088 type.go:168] "Request Body" body=""
	I1202 19:11:56.423078   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:56.423413   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:56.923138   49088 type.go:168] "Request Body" body=""
	I1202 19:11:56.923215   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:56.923566   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:57.423251   49088 type.go:168] "Request Body" body=""
	I1202 19:11:57.423331   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:57.423624   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:11:57.423682   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:11:57.923555   49088 type.go:168] "Request Body" body=""
	I1202 19:11:57.923629   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:57.923962   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:58.423744   49088 type.go:168] "Request Body" body=""
	I1202 19:11:58.423824   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:58.424157   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:58.923666   49088 type.go:168] "Request Body" body=""
	I1202 19:11:58.923737   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:58.924002   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:11:59.423779   49088 type.go:168] "Request Body" body=""
	I1202 19:11:59.423856   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:59.424204   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:11:59.424262   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:11:59.922940   49088 type.go:168] "Request Body" body=""
	I1202 19:11:59.923028   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:11:59.923350   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:00.422961   49088 type.go:168] "Request Body" body=""
	I1202 19:12:00.423042   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:00.423415   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:00.923010   49088 type.go:168] "Request Body" body=""
	I1202 19:12:00.923091   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:00.923432   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:01.423015   49088 type.go:168] "Request Body" body=""
	I1202 19:12:01.423098   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:01.423440   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:01.923901   49088 type.go:168] "Request Body" body=""
	I1202 19:12:01.923978   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:01.924301   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:12:01.924373   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:12:02.423047   49088 type.go:168] "Request Body" body=""
	I1202 19:12:02.423120   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:02.423479   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:02.923197   49088 type.go:168] "Request Body" body=""
	I1202 19:12:02.923275   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:02.923599   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:03.422914   49088 type.go:168] "Request Body" body=""
	I1202 19:12:03.422989   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:03.423273   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:03.922947   49088 type.go:168] "Request Body" body=""
	I1202 19:12:03.923023   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:03.923352   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:04.422979   49088 type.go:168] "Request Body" body=""
	I1202 19:12:04.423057   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:04.423399   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:12:04.423455   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:12:04.923130   49088 type.go:168] "Request Body" body=""
	I1202 19:12:04.923196   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:04.923453   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:05.423379   49088 type.go:168] "Request Body" body=""
	I1202 19:12:05.423457   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:05.423797   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:05.923568   49088 type.go:168] "Request Body" body=""
	I1202 19:12:05.923643   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:05.923966   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:06.423488   49088 type.go:168] "Request Body" body=""
	I1202 19:12:06.423556   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:06.423829   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:12:06.423875   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:12:06.923674   49088 type.go:168] "Request Body" body=""
	I1202 19:12:06.923754   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:06.924076   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:07.423881   49088 type.go:168] "Request Body" body=""
	I1202 19:12:07.423954   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:07.424301   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:07.923933   49088 type.go:168] "Request Body" body=""
	I1202 19:12:07.924013   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:07.924280   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:08.422981   49088 type.go:168] "Request Body" body=""
	I1202 19:12:08.423059   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:08.423411   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:08.923117   49088 type.go:168] "Request Body" body=""
	I1202 19:12:08.923190   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:08.923511   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:12:08.923573   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:12:09.422927   49088 type.go:168] "Request Body" body=""
	I1202 19:12:09.422996   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:09.423311   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:09.923024   49088 type.go:168] "Request Body" body=""
	I1202 19:12:09.923100   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:09.923439   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:10.422995   49088 type.go:168] "Request Body" body=""
	I1202 19:12:10.423070   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:10.423406   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:10.922922   49088 type.go:168] "Request Body" body=""
	I1202 19:12:10.922997   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:10.923336   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:11.422925   49088 type.go:168] "Request Body" body=""
	I1202 19:12:11.423002   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:11.423341   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:12:11.423398   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:12:11.923073   49088 type.go:168] "Request Body" body=""
	I1202 19:12:11.923147   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:11.923465   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:12.422920   49088 type.go:168] "Request Body" body=""
	I1202 19:12:12.422996   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:12.423266   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:12.922985   49088 type.go:168] "Request Body" body=""
	I1202 19:12:12.923057   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:12.923408   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:13.422982   49088 type.go:168] "Request Body" body=""
	I1202 19:12:13.423065   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:13.423401   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:12:13.423447   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:12:13.923740   49088 type.go:168] "Request Body" body=""
	I1202 19:12:13.923807   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:13.924068   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:14.423836   49088 type.go:168] "Request Body" body=""
	I1202 19:12:14.423917   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:14.424266   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:14.922896   49088 type.go:168] "Request Body" body=""
	I1202 19:12:14.922969   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:14.923318   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:15.423109   49088 type.go:168] "Request Body" body=""
	I1202 19:12:15.423181   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:15.423435   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:12:15.423486   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:12:15.922969   49088 type.go:168] "Request Body" body=""
	I1202 19:12:15.923045   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:15.923399   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:16.422983   49088 type.go:168] "Request Body" body=""
	I1202 19:12:16.423056   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:16.423395   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:16.922950   49088 type.go:168] "Request Body" body=""
	I1202 19:12:16.923033   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:16.923357   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:17.422981   49088 type.go:168] "Request Body" body=""
	I1202 19:12:17.423053   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:17.423403   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:17.923097   49088 type.go:168] "Request Body" body=""
	I1202 19:12:17.923175   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:17.923503   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:12:17.923560   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:12:18.423204   49088 type.go:168] "Request Body" body=""
	I1202 19:12:18.423269   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:18.423531   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:18.922961   49088 type.go:168] "Request Body" body=""
	I1202 19:12:18.923042   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:18.923383   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:19.422976   49088 type.go:168] "Request Body" body=""
	I1202 19:12:19.423057   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:19.423400   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:19.923083   49088 type.go:168] "Request Body" body=""
	I1202 19:12:19.923154   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:19.923406   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:20.423485   49088 type.go:168] "Request Body" body=""
	I1202 19:12:20.423567   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:20.423913   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:12:20.423967   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:12:20.923722   49088 type.go:168] "Request Body" body=""
	I1202 19:12:20.923795   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:20.924168   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:21.422933   49088 type.go:168] "Request Body" body=""
	I1202 19:12:21.422998   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:21.423294   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:21.922979   49088 type.go:168] "Request Body" body=""
	I1202 19:12:21.923058   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:21.923391   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:22.422989   49088 type.go:168] "Request Body" body=""
	I1202 19:12:22.423061   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:22.423414   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:22.923563   49088 type.go:168] "Request Body" body=""
	I1202 19:12:22.923637   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:22.923893   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:12:22.923932   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:12:23.423651   49088 type.go:168] "Request Body" body=""
	I1202 19:12:23.423730   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:23.424077   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:23.923725   49088 type.go:168] "Request Body" body=""
	I1202 19:12:23.923795   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:23.924130   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:24.423644   49088 type.go:168] "Request Body" body=""
	I1202 19:12:24.423724   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:24.424004   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:24.923060   49088 type.go:168] "Request Body" body=""
	I1202 19:12:24.923142   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:24.923487   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:25.423281   49088 type.go:168] "Request Body" body=""
	I1202 19:12:25.423364   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:25.423721   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:12:25.423779   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:12:25.923482   49088 type.go:168] "Request Body" body=""
	I1202 19:12:25.923550   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:25.923808   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:26.423542   49088 type.go:168] "Request Body" body=""
	I1202 19:12:26.423613   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:26.423935   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:26.923724   49088 type.go:168] "Request Body" body=""
	I1202 19:12:26.923807   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:26.924187   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:27.422900   49088 type.go:168] "Request Body" body=""
	I1202 19:12:27.422973   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:27.423252   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:27.922940   49088 type.go:168] "Request Body" body=""
	I1202 19:12:27.923019   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:27.923343   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:12:27.923403   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:12:28.422986   49088 type.go:168] "Request Body" body=""
	I1202 19:12:28.423061   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:28.423432   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:28.922921   49088 type.go:168] "Request Body" body=""
	I1202 19:12:28.923034   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:28.923349   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:29.422983   49088 type.go:168] "Request Body" body=""
	I1202 19:12:29.423092   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:29.423421   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:29.923070   49088 type.go:168] "Request Body" body=""
	I1202 19:12:29.923147   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:29.923486   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:12:29.923558   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:12:30.423578   49088 type.go:168] "Request Body" body=""
	I1202 19:12:30.423659   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:30.423950   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:30.923208   49088 type.go:168] "Request Body" body=""
	I1202 19:12:30.923281   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:30.923639   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:31.422982   49088 type.go:168] "Request Body" body=""
	I1202 19:12:31.423059   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:31.423398   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:31.922934   49088 type.go:168] "Request Body" body=""
	I1202 19:12:31.923000   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:31.923257   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:32.422953   49088 type.go:168] "Request Body" body=""
	I1202 19:12:32.423028   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:32.423354   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:12:32.423413   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:12:32.923085   49088 type.go:168] "Request Body" body=""
	I1202 19:12:32.923168   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:32.923564   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:33.423262   49088 type.go:168] "Request Body" body=""
	I1202 19:12:33.423340   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:33.423598   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:33.922973   49088 type.go:168] "Request Body" body=""
	I1202 19:12:33.923049   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:33.923389   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:34.422949   49088 type.go:168] "Request Body" body=""
	I1202 19:12:34.423022   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:34.423336   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:34.922878   49088 type.go:168] "Request Body" body=""
	I1202 19:12:34.922943   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:34.923200   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:12:34.923239   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:12:35.423165   49088 type.go:168] "Request Body" body=""
	I1202 19:12:35.423238   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:35.423568   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:35.923256   49088 type.go:168] "Request Body" body=""
	I1202 19:12:35.923329   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:35.923637   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:36.422898   49088 type.go:168] "Request Body" body=""
	I1202 19:12:36.422965   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:36.423218   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:36.922956   49088 type.go:168] "Request Body" body=""
	I1202 19:12:36.923035   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:36.923355   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:12:36.923409   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:12:37.423086   49088 type.go:168] "Request Body" body=""
	I1202 19:12:37.423161   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:37.423449   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:37.923834   49088 type.go:168] "Request Body" body=""
	I1202 19:12:37.923903   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:37.924159   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:38.422906   49088 type.go:168] "Request Body" body=""
	I1202 19:12:38.422977   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:38.423261   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:38.922970   49088 type.go:168] "Request Body" body=""
	I1202 19:12:38.923046   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:38.923389   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:12:38.923440   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:12:39.423714   49088 type.go:168] "Request Body" body=""
	I1202 19:12:39.423788   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:39.424096   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:39.923879   49088 type.go:168] "Request Body" body=""
	I1202 19:12:39.923950   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:39.924267   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:40.423003   49088 type.go:168] "Request Body" body=""
	I1202 19:12:40.423081   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:40.423385   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:40.923058   49088 type.go:168] "Request Body" body=""
	I1202 19:12:40.923129   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:40.923426   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:12:40.923486   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:12:41.422953   49088 type.go:168] "Request Body" body=""
	I1202 19:12:41.423050   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:41.423343   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:41.923078   49088 type.go:168] "Request Body" body=""
	I1202 19:12:41.923156   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:41.923473   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:42.423139   49088 type.go:168] "Request Body" body=""
	I1202 19:12:42.423204   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:42.423502   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:42.923011   49088 type.go:168] "Request Body" body=""
	I1202 19:12:42.923090   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:42.923406   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:43.423000   49088 type.go:168] "Request Body" body=""
	I1202 19:12:43.423079   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:43.423400   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:12:43.423446   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:12:43.923623   49088 type.go:168] "Request Body" body=""
	I1202 19:12:43.923696   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:43.923958   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:44.423707   49088 type.go:168] "Request Body" body=""
	I1202 19:12:44.423805   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:44.424192   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:44.923041   49088 type.go:168] "Request Body" body=""
	I1202 19:12:44.923114   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:44.923460   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:45.423487   49088 type.go:168] "Request Body" body=""
	I1202 19:12:45.423555   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:45.423816   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:12:45.423856   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:12:45.923602   49088 type.go:168] "Request Body" body=""
	I1202 19:12:45.923678   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:45.924003   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:46.423772   49088 type.go:168] "Request Body" body=""
	I1202 19:12:46.423843   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:46.424193   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:46.923702   49088 type.go:168] "Request Body" body=""
	I1202 19:12:46.923775   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:46.924028   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:47.423742   49088 type.go:168] "Request Body" body=""
	I1202 19:12:47.423811   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:47.424126   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:12:47.424184   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:12:47.923682   49088 type.go:168] "Request Body" body=""
	I1202 19:12:47.923759   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:47.924070   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:48.423650   49088 type.go:168] "Request Body" body=""
	I1202 19:12:48.423719   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:48.423981   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:48.923704   49088 type.go:168] "Request Body" body=""
	I1202 19:12:48.923774   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:48.924090   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:49.423900   49088 type.go:168] "Request Body" body=""
	I1202 19:12:49.423983   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:49.424354   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:12:49.424408   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:12:49.922950   49088 type.go:168] "Request Body" body=""
	I1202 19:12:49.923023   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:49.923365   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:50.423243   49088 type.go:168] "Request Body" body=""
	I1202 19:12:50.423322   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:50.423653   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:50.922964   49088 type.go:168] "Request Body" body=""
	I1202 19:12:50.923042   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:50.923377   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:51.422954   49088 type.go:168] "Request Body" body=""
	I1202 19:12:51.423023   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:51.423346   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:51.923047   49088 type.go:168] "Request Body" body=""
	I1202 19:12:51.923126   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:51.923460   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:12:51.923513   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:12:52.422937   49088 type.go:168] "Request Body" body=""
	I1202 19:12:52.423014   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:52.423303   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:52.922929   49088 type.go:168] "Request Body" body=""
	I1202 19:12:52.923016   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:52.923375   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:53.423008   49088 type.go:168] "Request Body" body=""
	I1202 19:12:53.423100   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:53.423482   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:53.922987   49088 type.go:168] "Request Body" body=""
	I1202 19:12:53.923061   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:53.923413   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:54.422909   49088 type.go:168] "Request Body" body=""
	I1202 19:12:54.422983   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:54.423246   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:12:54.423296   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:12:54.923072   49088 type.go:168] "Request Body" body=""
	I1202 19:12:54.923153   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:54.923523   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:55.423518   49088 type.go:168] "Request Body" body=""
	I1202 19:12:55.423603   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:55.423968   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:55.923602   49088 type.go:168] "Request Body" body=""
	I1202 19:12:55.923672   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:55.924000   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:56.423806   49088 type.go:168] "Request Body" body=""
	I1202 19:12:56.423881   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:56.424245   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:12:56.424298   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:12:56.923893   49088 type.go:168] "Request Body" body=""
	I1202 19:12:56.923967   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:56.924355   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:57.422930   49088 type.go:168] "Request Body" body=""
	I1202 19:12:57.422997   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:57.423287   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:57.922958   49088 type.go:168] "Request Body" body=""
	I1202 19:12:57.923030   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:57.923341   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:58.423065   49088 type.go:168] "Request Body" body=""
	I1202 19:12:58.423137   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:58.423498   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:58.923185   49088 type.go:168] "Request Body" body=""
	I1202 19:12:58.923255   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:58.923518   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:12:58.923557   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:12:59.422988   49088 type.go:168] "Request Body" body=""
	I1202 19:12:59.423062   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:59.423415   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:12:59.923116   49088 type.go:168] "Request Body" body=""
	I1202 19:12:59.923196   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:12:59.923537   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:00.422981   49088 type.go:168] "Request Body" body=""
	I1202 19:13:00.423055   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:00.423421   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:00.922957   49088 type.go:168] "Request Body" body=""
	I1202 19:13:00.923031   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:00.923358   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:01.423025   49088 type.go:168] "Request Body" body=""
	I1202 19:13:01.423098   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:01.423429   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:13:01.423485   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:13:01.923135   49088 type.go:168] "Request Body" body=""
	I1202 19:13:01.923210   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:01.923470   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:02.422970   49088 type.go:168] "Request Body" body=""
	I1202 19:13:02.423066   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:02.423386   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:02.923090   49088 type.go:168] "Request Body" body=""
	I1202 19:13:02.923194   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:02.923514   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:03.423202   49088 type.go:168] "Request Body" body=""
	I1202 19:13:03.423271   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:03.423580   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:13:03.423632   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:13:03.922949   49088 type.go:168] "Request Body" body=""
	I1202 19:13:03.923051   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:03.923344   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:04.422975   49088 type.go:168] "Request Body" body=""
	I1202 19:13:04.423049   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:04.423409   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:04.923130   49088 type.go:168] "Request Body" body=""
	I1202 19:13:04.923198   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:04.923485   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:05.423469   49088 type.go:168] "Request Body" body=""
	I1202 19:13:05.423540   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:05.423887   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:13:05.423943   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:13:05.923727   49088 type.go:168] "Request Body" body=""
	I1202 19:13:05.923801   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:05.924115   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:06.423862   49088 type.go:168] "Request Body" body=""
	I1202 19:13:06.423938   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:06.424199   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:06.922912   49088 type.go:168] "Request Body" body=""
	I1202 19:13:06.922984   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:06.923325   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:07.422978   49088 type.go:168] "Request Body" body=""
	I1202 19:13:07.423053   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:07.423373   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:07.922921   49088 type.go:168] "Request Body" body=""
	I1202 19:13:07.922995   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:07.923258   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:13:07.923298   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:13:08.422983   49088 type.go:168] "Request Body" body=""
	I1202 19:13:08.423057   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:08.423391   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:08.922933   49088 type.go:168] "Request Body" body=""
	I1202 19:13:08.923006   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:08.923329   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:09.422864   49088 type.go:168] "Request Body" body=""
	I1202 19:13:09.422931   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:09.423213   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:09.922936   49088 type.go:168] "Request Body" body=""
	I1202 19:13:09.923016   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:09.923330   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:13:09.923387   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:13:10.422968   49088 type.go:168] "Request Body" body=""
	I1202 19:13:10.423047   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:10.423382   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:10.922900   49088 type.go:168] "Request Body" body=""
	I1202 19:13:10.922972   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:10.923227   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:11.422967   49088 type.go:168] "Request Body" body=""
	I1202 19:13:11.423047   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:11.423372   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:11.922966   49088 type.go:168] "Request Body" body=""
	I1202 19:13:11.923038   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:11.923347   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:12.422911   49088 type.go:168] "Request Body" body=""
	I1202 19:13:12.422981   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:12.423291   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:13:12.423344   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:13:12.922990   49088 type.go:168] "Request Body" body=""
	I1202 19:13:12.923081   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:12.923443   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:13.423156   49088 type.go:168] "Request Body" body=""
	I1202 19:13:13.423234   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:13.423549   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:13.922900   49088 type.go:168] "Request Body" body=""
	I1202 19:13:13.922969   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:13.923235   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:14.422957   49088 type.go:168] "Request Body" body=""
	I1202 19:13:14.423029   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:14.423396   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:13:14.423449   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:13:14.923039   49088 type.go:168] "Request Body" body=""
	I1202 19:13:14.923111   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:14.923397   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:15.423223   49088 type.go:168] "Request Body" body=""
	I1202 19:13:15.423302   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:15.423557   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:15.922960   49088 type.go:168] "Request Body" body=""
	I1202 19:13:15.923034   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:15.923367   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:16.423062   49088 type.go:168] "Request Body" body=""
	I1202 19:13:16.423140   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:16.423473   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:13:16.423529   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:13:16.923839   49088 type.go:168] "Request Body" body=""
	I1202 19:13:16.923917   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:16.924188   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:17.422894   49088 type.go:168] "Request Body" body=""
	I1202 19:13:17.422973   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:17.423308   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:17.922909   49088 type.go:168] "Request Body" body=""
	I1202 19:13:17.922985   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:17.923348   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:18.423042   49088 type.go:168] "Request Body" body=""
	I1202 19:13:18.423112   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:18.423378   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:18.923054   49088 type.go:168] "Request Body" body=""
	I1202 19:13:18.923129   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:18.923451   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:13:18.923507   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:13:19.422987   49088 type.go:168] "Request Body" body=""
	I1202 19:13:19.423063   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:19.423405   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:19.923563   49088 type.go:168] "Request Body" body=""
	I1202 19:13:19.923634   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:19.923942   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:20.423766   49088 type.go:168] "Request Body" body=""
	I1202 19:13:20.423836   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:20.424183   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:20.922889   49088 type.go:168] "Request Body" body=""
	I1202 19:13:20.922963   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:20.923286   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:21.422910   49088 type.go:168] "Request Body" body=""
	I1202 19:13:21.422986   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:21.423265   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:13:21.423304   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:13:21.922960   49088 type.go:168] "Request Body" body=""
	I1202 19:13:21.923032   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:21.923372   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:22.422982   49088 type.go:168] "Request Body" body=""
	I1202 19:13:22.423066   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:22.423408   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:22.923660   49088 type.go:168] "Request Body" body=""
	I1202 19:13:22.923726   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:22.923989   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:23.423852   49088 type.go:168] "Request Body" body=""
	I1202 19:13:23.423930   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:23.424355   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	W1202 19:13:23.424410   49088 node_ready.go:55] error getting node "functional-449836" condition "Ready" status (will retry): Get "https://192.168.49.2:8441/api/v1/nodes/functional-449836": dial tcp 192.168.49.2:8441: connect: connection refused
	I1202 19:13:23.922972   49088 type.go:168] "Request Body" body=""
	I1202 19:13:23.923088   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:23.923457   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:24.423150   49088 type.go:168] "Request Body" body=""
	I1202 19:13:24.423233   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:24.423566   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:24.923507   49088 type.go:168] "Request Body" body=""
	I1202 19:13:24.923591   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:24.923948   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:25.423717   49088 type.go:168] "Request Body" body=""
	I1202 19:13:25.423797   49088 round_trippers.go:527] "Request" verb="GET" url="https://192.168.49.2:8441/api/v1/nodes/functional-449836" headers=<
		Accept: application/vnd.kubernetes.protobuf,application/json
		User-Agent: minikube-linux-arm64/v0.0.0 (linux/arm64) kubernetes/$Format
	 >
	I1202 19:13:25.424161   49088 round_trippers.go:632] "Response" status="" headers="" milliseconds=0
	I1202 19:13:25.922841   49088 node_ready.go:38] duration metric: took 6m0.000085627s for node "functional-449836" to be "Ready" ...
	I1202 19:13:25.925875   49088 out.go:203] 
	W1202 19:13:25.928738   49088 out.go:285] X Exiting due to GUEST_START: failed to start node: wait 6m0s for node: waiting for node to be ready: WaitNodeCondition: context deadline exceeded
	W1202 19:13:25.928760   49088 out.go:285] * 
	W1202 19:13:25.930899   49088 out.go:308] ╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	I1202 19:13:25.934748   49088 out.go:203] 
	
	
	==> container status <==
	CONTAINER           IMAGE               CREATED             STATE               NAME                ATTEMPT             POD ID              POD                 NAMESPACE
	
	
	==> containerd <==
	Dec 02 19:13:33 functional-449836 containerd[5842]: time="2025-12-02T19:13:33.645009599Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.3\"  labels:{key:\"io.cri-containerd.image\"  value:\"managed\"}"
	Dec 02 19:13:34 functional-449836 containerd[5842]: time="2025-12-02T19:13:34.681117821Z" level=info msg="No images store for sha256:a1f83055284ec302ac691d8677946d8b4e772fb7071d39ada1cc9184cb70814b"
	Dec 02 19:13:34 functional-449836 containerd[5842]: time="2025-12-02T19:13:34.683398192Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:latest\""
	Dec 02 19:13:34 functional-449836 containerd[5842]: time="2025-12-02T19:13:34.693674136Z" level=info msg="ImageCreate event name:\"sha256:8cb2091f603e75187e2f6226c5901d12e00b1d1f778c6471ae4578e8a1c4724a\"  labels:{key:\"io.cri-containerd.image\"  value:\"managed\"}"
	Dec 02 19:13:34 functional-449836 containerd[5842]: time="2025-12-02T19:13:34.694210924Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:latest\"  labels:{key:\"io.cri-containerd.image\"  value:\"managed\"}"
	Dec 02 19:13:35 functional-449836 containerd[5842]: time="2025-12-02T19:13:35.662714060Z" level=info msg="No images store for sha256:2eaa477b07fa94239065ddfa3c63972bc774ee1ebce5861cf639d04e0692e711"
	Dec 02 19:13:35 functional-449836 containerd[5842]: time="2025-12-02T19:13:35.665361216Z" level=info msg="ImageCreate event name:\"docker.io/library/minikube-local-cache-test:functional-449836\""
	Dec 02 19:13:35 functional-449836 containerd[5842]: time="2025-12-02T19:13:35.678350434Z" level=info msg="ImageCreate event name:\"sha256:9d59d8178e3a4c209f1c923737212d2bc54133c85455f4f8e051d069a9d30853\"  labels:{key:\"io.cri-containerd.image\"  value:\"managed\"}"
	Dec 02 19:13:35 functional-449836 containerd[5842]: time="2025-12-02T19:13:35.678839993Z" level=info msg="ImageUpdate event name:\"docker.io/library/minikube-local-cache-test:functional-449836\"  labels:{key:\"io.cri-containerd.image\"  value:\"managed\"}"
	Dec 02 19:13:36 functional-449836 containerd[5842]: time="2025-12-02T19:13:36.465220542Z" level=info msg="RemoveImage \"registry.k8s.io/pause:latest\""
	Dec 02 19:13:36 functional-449836 containerd[5842]: time="2025-12-02T19:13:36.468232761Z" level=info msg="ImageDelete event name:\"registry.k8s.io/pause:latest\""
	Dec 02 19:13:36 functional-449836 containerd[5842]: time="2025-12-02T19:13:36.470595363Z" level=info msg="ImageDelete event name:\"sha256:8cb2091f603e75187e2f6226c5901d12e00b1d1f778c6471ae4578e8a1c4724a\""
	Dec 02 19:13:36 functional-449836 containerd[5842]: time="2025-12-02T19:13:36.483876889Z" level=info msg="RemoveImage \"registry.k8s.io/pause:latest\" returns successfully"
	Dec 02 19:13:37 functional-449836 containerd[5842]: time="2025-12-02T19:13:37.376758474Z" level=info msg="RemoveImage \"registry.k8s.io/pause:3.1\""
	Dec 02 19:13:37 functional-449836 containerd[5842]: time="2025-12-02T19:13:37.379358582Z" level=info msg="ImageDelete event name:\"registry.k8s.io/pause:3.1\""
	Dec 02 19:13:37 functional-449836 containerd[5842]: time="2025-12-02T19:13:37.382369874Z" level=info msg="ImageDelete event name:\"sha256:8057e0500773a37cde2cff041eb13ebd68c748419a2fbfd1dfb5bf38696cc8e5\""
	Dec 02 19:13:37 functional-449836 containerd[5842]: time="2025-12-02T19:13:37.389733563Z" level=info msg="RemoveImage \"registry.k8s.io/pause:3.1\" returns successfully"
	Dec 02 19:13:37 functional-449836 containerd[5842]: time="2025-12-02T19:13:37.509877470Z" level=info msg="No images store for sha256:3ac89611d5efd8eb74174b1f04c33b7e73b651cec35b5498caf0cfdd2efd7d48"
	Dec 02 19:13:37 functional-449836 containerd[5842]: time="2025-12-02T19:13:37.512059256Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.1\""
	Dec 02 19:13:37 functional-449836 containerd[5842]: time="2025-12-02T19:13:37.519058867Z" level=info msg="ImageCreate event name:\"sha256:8057e0500773a37cde2cff041eb13ebd68c748419a2fbfd1dfb5bf38696cc8e5\"  labels:{key:\"io.cri-containerd.image\"  value:\"managed\"}"
	Dec 02 19:13:37 functional-449836 containerd[5842]: time="2025-12-02T19:13:37.519608110Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.1\"  labels:{key:\"io.cri-containerd.image\"  value:\"managed\"}"
	Dec 02 19:13:37 functional-449836 containerd[5842]: time="2025-12-02T19:13:37.678879163Z" level=info msg="No images store for sha256:a1f83055284ec302ac691d8677946d8b4e772fb7071d39ada1cc9184cb70814b"
	Dec 02 19:13:37 functional-449836 containerd[5842]: time="2025-12-02T19:13:37.681070623Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:latest\""
	Dec 02 19:13:37 functional-449836 containerd[5842]: time="2025-12-02T19:13:37.692574195Z" level=info msg="ImageCreate event name:\"sha256:8cb2091f603e75187e2f6226c5901d12e00b1d1f778c6471ae4578e8a1c4724a\"  labels:{key:\"io.cri-containerd.image\"  value:\"managed\"}"
	Dec 02 19:13:37 functional-449836 containerd[5842]: time="2025-12-02T19:13:37.692993650Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:latest\"  labels:{key:\"io.cri-containerd.image\"  value:\"managed\"}"
	
	
	==> describe nodes <==
	command /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" failed with error: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:13:41.700659    9922 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:13:41.701268    9922 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:13:41.702823    9922 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:13:41.703245    9922 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:13:41.704782    9922 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	
	==> dmesg <==
	[Dec 2 18:17] ACPI: SRAT not present
	[  +0.000000] ACPI: SRAT not present
	[  +0.000000] SPI driver altr_a10sr has no spi_device_id for altr,a10sr
	[  +0.015127] device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
	[  +0.494583] systemd[1]: Configuration file /run/systemd/system/netplan-ovs-cleanup.service is marked world-inaccessible. This has no effect as configuration data is accessible via APIs without restrictions. Proceeding anyway.
	[  +0.035754] systemd[1]: /lib/systemd/system/snapd.service:23: Unknown key name 'RestartMode' in section 'Service', ignoring.
	[  +0.870945] ena 0000:00:05.0: LLQ is not supported Fallback to host mode policy.
	[  +6.299680] kauditd_printk_skb: 36 callbacks suppressed
	
	
	==> kernel <==
	 19:13:41 up 55 min,  0 user,  load average: 0.50, 0.36, 0.52
	Linux functional-449836 5.15.0-1084-aws #91~20.04.1-Ubuntu SMP Fri May 2 07:00:04 UTC 2025 aarch64 GNU/Linux
	PRETTY_NAME="Debian GNU/Linux 12 (bookworm)"
	
	
	==> kubelet <==
	Dec 02 19:13:38 functional-449836 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 02 19:13:39 functional-449836 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 828.
	Dec 02 19:13:39 functional-449836 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 02 19:13:39 functional-449836 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 02 19:13:39 functional-449836 kubelet[9788]: E1202 19:13:39.491201    9788 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 02 19:13:39 functional-449836 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 02 19:13:39 functional-449836 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 02 19:13:40 functional-449836 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 829.
	Dec 02 19:13:40 functional-449836 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 02 19:13:40 functional-449836 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 02 19:13:40 functional-449836 kubelet[9803]: E1202 19:13:40.236696    9803 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 02 19:13:40 functional-449836 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 02 19:13:40 functional-449836 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 02 19:13:40 functional-449836 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 830.
	Dec 02 19:13:40 functional-449836 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 02 19:13:40 functional-449836 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 02 19:13:40 functional-449836 kubelet[9838]: E1202 19:13:40.993325    9838 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 02 19:13:40 functional-449836 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 02 19:13:40 functional-449836 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 02 19:13:41 functional-449836 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 831.
	Dec 02 19:13:41 functional-449836 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 02 19:13:41 functional-449836 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 02 19:13:41 functional-449836 kubelet[9926]: E1202 19:13:41.732983    9926 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 02 19:13:41 functional-449836 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 02 19:13:41 functional-449836 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	

-- /stdout --
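The kubelet journal above shows a tight crash loop on the cgroup v1 validation error: systemd's restart counter climbs from 828 to 831 between 19:13:39 and 19:13:41. A small illustrative calculation (timestamps and counters copied from the journal lines above) makes the cadence explicit:

```python
# Illustration only: restart counter and timestamps are taken verbatim from
# the kubelet journal excerpt above (restart counters 828 through 831).
from datetime import datetime

restarts = [
    ("19:13:39", 828),
    ("19:13:40", 829),
    ("19:13:40", 830),
    ("19:13:41", 831),
]
t0 = datetime.strptime(restarts[0][0], "%H:%M:%S")
t1 = datetime.strptime(restarts[-1][0], "%H:%M:%S")
span = (t1 - t0).total_seconds()
count = restarts[-1][1] - restarts[0][1]
print(count, "restarts in", span, "seconds")
# → 3 restarts in 2.0 seconds
```

At that rate the kubelet never stays up long enough to serve `/healthz`, which is consistent with the kubeadm `wait-control-plane` timeout seen later in this report.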
helpers_test.go:262: (dbg) Run:  out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-449836 -n functional-449836
helpers_test.go:262: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-449836 -n functional-449836: exit status 2 (413.528297ms)

-- stdout --
	Stopped

-- /stdout --
helpers_test.go:262: status error: exit status 2 (may be ok)
helpers_test.go:264: "functional-449836" apiserver is not running, skipping kubectl commands (state="Stopped")
--- FAIL: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/MinikubeKubectlCmdDirectly (2.33s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/ExtraConfig (734.82s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/ExtraConfig
functional_test.go:772: (dbg) Run:  out/minikube-linux-arm64 start -p functional-449836 --extra-config=apiserver.enable-admission-plugins=NamespaceAutoProvision --wait=all
E1202 19:15:46.066139    4435 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/addons-932514/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1202 19:18:00.703908    4435 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-224594/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1202 19:19:23.772570    4435 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-224594/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1202 19:20:46.067975    4435 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/addons-932514/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1202 19:23:00.703525    4435 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-224594/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1202 19:25:46.068057    4435 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/addons-932514/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
functional_test.go:772: (dbg) Non-zero exit: out/minikube-linux-arm64 start -p functional-449836 --extra-config=apiserver.enable-admission-plugins=NamespaceAutoProvision --wait=all: exit status 109 (12m12.715727744s)

-- stdout --
	* [functional-449836] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	  - MINIKUBE_LOCATION=22021
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - KUBECONFIG=/home/jenkins/minikube-integration/22021-2487/kubeconfig
	  - MINIKUBE_HOME=/home/jenkins/minikube-integration/22021-2487/.minikube
	  - MINIKUBE_BIN=out/minikube-linux-arm64
	  - MINIKUBE_FORCE_SYSTEMD=
	* Using the docker driver based on existing profile
	* Starting "functional-449836" primary control-plane node in "functional-449836" cluster
	* Pulling base image v0.0.48-1764169655-21974 ...
	* Preparing Kubernetes v1.35.0-beta.0 on containerd 2.1.5 ...
	  - apiserver.enable-admission-plugins=NamespaceAutoProvision
	
	

-- /stdout --
** stderr ** 
	! Unable to restart control-plane node(s), will reset cluster: <no value>
	! initialization failed, will try again: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-beta.0
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.001197768s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	
	To see the stack trace of this error execute with --v=5 or higher
	
	* 
	X Error starting cluster: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-beta.0
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.001082452s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	
	To see the stack trace of this error execute with --v=5 or higher
	
	* 
	╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	X Exiting due to K8S_KUBELET_NOT_RUNNING: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-beta.0
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.001082452s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	
	To see the stack trace of this error execute with --v=5 or higher
	
	* Suggestion: Check output of 'journalctl -xeu kubelet', try passing --extra-config=kubelet.cgroup-driver=systemd to minikube start
	* Related issue: https://github.com/kubernetes/minikube/issues/4172

** /stderr **
functional_test.go:774: failed to restart minikube. args "out/minikube-linux-arm64 start -p functional-449836 --extra-config=apiserver.enable-admission-plugins=NamespaceAutoProvision --wait=all": exit status 109
functional_test.go:776: restart took 12m12.716968819s for "functional-449836" cluster.
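The `[WARNING SystemVerification]` lines in the stderr above name the remediation for hosts still on cgroup v1: the kubelet configuration option 'FailCgroupV1' must be set to 'false' explicitly. A minimal KubeletConfiguration sketch follows; the lowercase field casing is an assumption based on the usual kubelet config serialization, and it has not been validated against this v1.35.0-beta.0 build:

```yaml
# Sketch only: per the kubeadm warning above, kubelet v1.35+ refuses to start
# on a cgroup v1 host unless this option is set explicitly.
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
failCgroupV1: false
```

Per the same warning, the corresponding kubeadm SystemVerification preflight check must also be skipped explicitly for this to take effect.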
I1202 19:25:55.436956    4435 config.go:182] Loaded profile config "functional-449836": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-beta.0
helpers_test.go:222: -----------------------post-mortem--------------------------------
helpers_test.go:223: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/ExtraConfig]: network settings <======
helpers_test.go:230: HOST ENV snapshots: PROXY env: HTTP_PROXY="<empty>" HTTPS_PROXY="<empty>" NO_PROXY="<empty>"
helpers_test.go:238: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/ExtraConfig]: docker inspect <======
helpers_test.go:239: (dbg) Run:  docker inspect functional-449836
helpers_test.go:243: (dbg) docker inspect functional-449836:

-- stdout --
	[
	    {
	        "Id": "6870f21b62bb6903aca3129f1ce4723cf3f2ffad99a50b164f8e2dc04b50e75d",
	        "Created": "2025-12-02T18:58:53.515075222Z",
	        "Path": "/usr/local/bin/entrypoint",
	        "Args": [
	            "/sbin/init"
	        ],
	        "State": {
	            "Status": "running",
	            "Running": true,
	            "Paused": false,
	            "Restarting": false,
	            "OOMKilled": false,
	            "Dead": false,
	            "Pid": 43093,
	            "ExitCode": 0,
	            "Error": "",
	            "StartedAt": "2025-12-02T18:58:53.587847975Z",
	            "FinishedAt": "0001-01-01T00:00:00Z"
	        },
	        "Image": "sha256:ac919894123858c63a6b115b7a0677e38aafc32ba4f00c3ebbd7c61e958451be",
	        "ResolvConfPath": "/var/lib/docker/containers/6870f21b62bb6903aca3129f1ce4723cf3f2ffad99a50b164f8e2dc04b50e75d/resolv.conf",
	        "HostnamePath": "/var/lib/docker/containers/6870f21b62bb6903aca3129f1ce4723cf3f2ffad99a50b164f8e2dc04b50e75d/hostname",
	        "HostsPath": "/var/lib/docker/containers/6870f21b62bb6903aca3129f1ce4723cf3f2ffad99a50b164f8e2dc04b50e75d/hosts",
	        "LogPath": "/var/lib/docker/containers/6870f21b62bb6903aca3129f1ce4723cf3f2ffad99a50b164f8e2dc04b50e75d/6870f21b62bb6903aca3129f1ce4723cf3f2ffad99a50b164f8e2dc04b50e75d-json.log",
	        "Name": "/functional-449836",
	        "RestartCount": 0,
	        "Driver": "overlay2",
	        "Platform": "linux",
	        "MountLabel": "",
	        "ProcessLabel": "",
	        "AppArmorProfile": "unconfined",
	        "ExecIDs": null,
	        "HostConfig": {
	            "Binds": [
	                "/lib/modules:/lib/modules:ro",
	                "functional-449836:/var"
	            ],
	            "ContainerIDFile": "",
	            "LogConfig": {
	                "Type": "json-file",
	                "Config": {}
	            },
	            "NetworkMode": "functional-449836",
	            "PortBindings": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "8441/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ]
	            },
	            "RestartPolicy": {
	                "Name": "no",
	                "MaximumRetryCount": 0
	            },
	            "AutoRemove": false,
	            "VolumeDriver": "",
	            "VolumesFrom": null,
	            "ConsoleSize": [
	                0,
	                0
	            ],
	            "CapAdd": null,
	            "CapDrop": null,
	            "CgroupnsMode": "host",
	            "Dns": [],
	            "DnsOptions": [],
	            "DnsSearch": [],
	            "ExtraHosts": null,
	            "GroupAdd": null,
	            "IpcMode": "private",
	            "Cgroup": "",
	            "Links": null,
	            "OomScoreAdj": 0,
	            "PidMode": "",
	            "Privileged": true,
	            "PublishAllPorts": false,
	            "ReadonlyRootfs": false,
	            "SecurityOpt": [
	                "seccomp=unconfined",
	                "apparmor=unconfined",
	                "label=disable"
	            ],
	            "Tmpfs": {
	                "/run": "",
	                "/tmp": ""
	            },
	            "UTSMode": "",
	            "UsernsMode": "",
	            "ShmSize": 67108864,
	            "Runtime": "runc",
	            "Isolation": "",
	            "CpuShares": 0,
	            "Memory": 4294967296,
	            "NanoCpus": 2000000000,
	            "CgroupParent": "",
	            "BlkioWeight": 0,
	            "BlkioWeightDevice": [],
	            "BlkioDeviceReadBps": [],
	            "BlkioDeviceWriteBps": [],
	            "BlkioDeviceReadIOps": [],
	            "BlkioDeviceWriteIOps": [],
	            "CpuPeriod": 0,
	            "CpuQuota": 0,
	            "CpuRealtimePeriod": 0,
	            "CpuRealtimeRuntime": 0,
	            "CpusetCpus": "",
	            "CpusetMems": "",
	            "Devices": [],
	            "DeviceCgroupRules": null,
	            "DeviceRequests": null,
	            "MemoryReservation": 0,
	            "MemorySwap": 8589934592,
	            "MemorySwappiness": null,
	            "OomKillDisable": false,
	            "PidsLimit": null,
	            "Ulimits": [],
	            "CpuCount": 0,
	            "CpuPercent": 0,
	            "IOMaximumIOps": 0,
	            "IOMaximumBandwidth": 0,
	            "MaskedPaths": null,
	            "ReadonlyPaths": null
	        },
	        "GraphDriver": {
	            "Data": {
	                "ID": "6870f21b62bb6903aca3129f1ce4723cf3f2ffad99a50b164f8e2dc04b50e75d",
	                "LowerDir": "/var/lib/docker/overlay2/41ea5a18fbf8709879a7fc4066a6a4a1474aa86e898ee1ccabe5669a1871131d-init/diff:/var/lib/docker/overlay2/a59c61675ee48e07a7f4a8725bd393449453344ad8907963779ea1c0059d936c/diff",
	                "MergedDir": "/var/lib/docker/overlay2/41ea5a18fbf8709879a7fc4066a6a4a1474aa86e898ee1ccabe5669a1871131d/merged",
	                "UpperDir": "/var/lib/docker/overlay2/41ea5a18fbf8709879a7fc4066a6a4a1474aa86e898ee1ccabe5669a1871131d/diff",
	                "WorkDir": "/var/lib/docker/overlay2/41ea5a18fbf8709879a7fc4066a6a4a1474aa86e898ee1ccabe5669a1871131d/work"
	            },
	            "Name": "overlay2"
	        },
	        "Mounts": [
	            {
	                "Type": "bind",
	                "Source": "/lib/modules",
	                "Destination": "/lib/modules",
	                "Mode": "ro",
	                "RW": false,
	                "Propagation": "rprivate"
	            },
	            {
	                "Type": "volume",
	                "Name": "functional-449836",
	                "Source": "/var/lib/docker/volumes/functional-449836/_data",
	                "Destination": "/var",
	                "Driver": "local",
	                "Mode": "z",
	                "RW": true,
	                "Propagation": ""
	            }
	        ],
	        "Config": {
	            "Hostname": "functional-449836",
	            "Domainname": "",
	            "User": "",
	            "AttachStdin": false,
	            "AttachStdout": false,
	            "AttachStderr": false,
	            "ExposedPorts": {
	                "22/tcp": {},
	                "2376/tcp": {},
	                "32443/tcp": {},
	                "5000/tcp": {},
	                "8441/tcp": {}
	            },
	            "Tty": true,
	            "OpenStdin": false,
	            "StdinOnce": false,
	            "Env": [
	                "container=docker",
	                "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin"
	            ],
	            "Cmd": null,
	            "Image": "gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b",
	            "Volumes": null,
	            "WorkingDir": "/",
	            "Entrypoint": [
	                "/usr/local/bin/entrypoint",
	                "/sbin/init"
	            ],
	            "OnBuild": null,
	            "Labels": {
	                "created_by.minikube.sigs.k8s.io": "true",
	                "mode.minikube.sigs.k8s.io": "functional-449836",
	                "name.minikube.sigs.k8s.io": "functional-449836",
	                "role.minikube.sigs.k8s.io": ""
	            },
	            "StopSignal": "SIGRTMIN+3"
	        },
	        "NetworkSettings": {
	            "Bridge": "",
	            "SandboxID": "410fbaab809a56b556195f7ed6eeff8dcd31e9020fb1dbfacf74828b79df3d88",
	            "SandboxKey": "/var/run/docker/netns/410fbaab809a",
	            "Ports": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "32788"
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "32789"
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "32792"
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "32790"
	                    }
	                ],
	                "8441/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "32791"
	                    }
	                ]
	            },
	            "HairpinMode": false,
	            "LinkLocalIPv6Address": "",
	            "LinkLocalIPv6PrefixLen": 0,
	            "SecondaryIPAddresses": null,
	            "SecondaryIPv6Addresses": null,
	            "EndpointID": "",
	            "Gateway": "",
	            "GlobalIPv6Address": "",
	            "GlobalIPv6PrefixLen": 0,
	            "IPAddress": "",
	            "IPPrefixLen": 0,
	            "IPv6Gateway": "",
	            "MacAddress": "",
	            "Networks": {
	                "functional-449836": {
	                    "IPAMConfig": {
	                        "IPv4Address": "192.168.49.2"
	                    },
	                    "Links": null,
	                    "Aliases": null,
	                    "MacAddress": "82:18:11:ce:46:48",
	                    "DriverOpts": null,
	                    "GwPriority": 0,
	                    "NetworkID": "3cb7d67d4fa267ddd6b37211325b224fb3fb811be8ff57bda18e19f6929ec9c8",
	                    "EndpointID": "20c8c1a67e53d7615656777f73986a40cb1c6affb22c4db185c479ac85cbdb14",
	                    "Gateway": "192.168.49.1",
	                    "IPAddress": "192.168.49.2",
	                    "IPPrefixLen": 24,
	                    "IPv6Gateway": "",
	                    "GlobalIPv6Address": "",
	                    "GlobalIPv6PrefixLen": 0,
	                    "DNSNames": [
	                        "functional-449836",
	                        "6870f21b62bb"
	                    ]
	                }
	            }
	        }
	    }
	]

-- /stdout --
helpers_test.go:247: (dbg) Run:  out/minikube-linux-arm64 status --format={{.Host}} -p functional-449836 -n functional-449836
helpers_test.go:247: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.Host}} -p functional-449836 -n functional-449836: exit status 2 (316.503976ms)

-- stdout --
	Running

-- /stdout --
helpers_test.go:247: status error: exit status 2 (may be ok)
helpers_test.go:252: <<< TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/ExtraConfig FAILED: start of post-mortem logs <<<
helpers_test.go:253: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/ExtraConfig]: minikube logs <======
helpers_test.go:255: (dbg) Run:  out/minikube-linux-arm64 -p functional-449836 logs -n 25
helpers_test.go:260: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/ExtraConfig logs: 
-- stdout --
	
	==> Audit <==
	┌─────────┬─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┬───────────────────┬─────────┬─────────┬─────────────────────┬─────────────────────┐
	│ COMMAND │                                                                          ARGS                                                                           │      PROFILE      │  USER   │ VERSION │     START TIME      │      END TIME       │
	├─────────┼─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┼───────────────────┼─────────┼─────────┼─────────────────────┼─────────────────────┤
	│ image   │ functional-224594 image ls --format yaml --alsologtostderr                                                                                              │ functional-224594 │ jenkins │ v1.37.0 │ 02 Dec 25 18:58 UTC │ 02 Dec 25 18:58 UTC │
	│ ssh     │ functional-224594 ssh pgrep buildkitd                                                                                                                   │ functional-224594 │ jenkins │ v1.37.0 │ 02 Dec 25 18:58 UTC │                     │
	│ image   │ functional-224594 image ls --format json --alsologtostderr                                                                                              │ functional-224594 │ jenkins │ v1.37.0 │ 02 Dec 25 18:58 UTC │ 02 Dec 25 18:58 UTC │
	│ image   │ functional-224594 image ls --format table --alsologtostderr                                                                                             │ functional-224594 │ jenkins │ v1.37.0 │ 02 Dec 25 18:58 UTC │ 02 Dec 25 18:58 UTC │
	│ image   │ functional-224594 image build -t localhost/my-image:functional-224594 testdata/build --alsologtostderr                                                  │ functional-224594 │ jenkins │ v1.37.0 │ 02 Dec 25 18:58 UTC │ 02 Dec 25 18:58 UTC │
	│ image   │ functional-224594 image ls                                                                                                                              │ functional-224594 │ jenkins │ v1.37.0 │ 02 Dec 25 18:58 UTC │ 02 Dec 25 18:58 UTC │
	│ delete  │ -p functional-224594                                                                                                                                    │ functional-224594 │ jenkins │ v1.37.0 │ 02 Dec 25 18:58 UTC │ 02 Dec 25 18:58 UTC │
	│ start   │ -p functional-449836 --memory=4096 --apiserver-port=8441 --wait=all --driver=docker  --container-runtime=containerd --kubernetes-version=v1.35.0-beta.0 │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 18:58 UTC │                     │
	│ start   │ -p functional-449836 --alsologtostderr -v=8                                                                                                             │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:07 UTC │                     │
	│ cache   │ functional-449836 cache add registry.k8s.io/pause:3.1                                                                                                   │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:13 UTC │ 02 Dec 25 19:13 UTC │
	│ cache   │ functional-449836 cache add registry.k8s.io/pause:3.3                                                                                                   │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:13 UTC │ 02 Dec 25 19:13 UTC │
	│ cache   │ functional-449836 cache add registry.k8s.io/pause:latest                                                                                                │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:13 UTC │ 02 Dec 25 19:13 UTC │
	│ cache   │ functional-449836 cache add minikube-local-cache-test:functional-449836                                                                                 │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:13 UTC │ 02 Dec 25 19:13 UTC │
	│ cache   │ functional-449836 cache delete minikube-local-cache-test:functional-449836                                                                              │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:13 UTC │ 02 Dec 25 19:13 UTC │
	│ cache   │ delete registry.k8s.io/pause:3.3                                                                                                                        │ minikube          │ jenkins │ v1.37.0 │ 02 Dec 25 19:13 UTC │ 02 Dec 25 19:13 UTC │
	│ cache   │ list                                                                                                                                                    │ minikube          │ jenkins │ v1.37.0 │ 02 Dec 25 19:13 UTC │ 02 Dec 25 19:13 UTC │
	│ ssh     │ functional-449836 ssh sudo crictl images                                                                                                                │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:13 UTC │ 02 Dec 25 19:13 UTC │
	│ ssh     │ functional-449836 ssh sudo crictl rmi registry.k8s.io/pause:latest                                                                                      │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:13 UTC │ 02 Dec 25 19:13 UTC │
	│ ssh     │ functional-449836 ssh sudo crictl inspecti registry.k8s.io/pause:latest                                                                                 │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:13 UTC │                     │
	│ cache   │ functional-449836 cache reload                                                                                                                          │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:13 UTC │ 02 Dec 25 19:13 UTC │
	│ ssh     │ functional-449836 ssh sudo crictl inspecti registry.k8s.io/pause:latest                                                                                 │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:13 UTC │ 02 Dec 25 19:13 UTC │
	│ cache   │ delete registry.k8s.io/pause:3.1                                                                                                                        │ minikube          │ jenkins │ v1.37.0 │ 02 Dec 25 19:13 UTC │ 02 Dec 25 19:13 UTC │
	│ cache   │ delete registry.k8s.io/pause:latest                                                                                                                     │ minikube          │ jenkins │ v1.37.0 │ 02 Dec 25 19:13 UTC │ 02 Dec 25 19:13 UTC │
	│ kubectl │ functional-449836 kubectl -- --context functional-449836 get pods                                                                                       │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:13 UTC │                     │
	│ start   │ -p functional-449836 --extra-config=apiserver.enable-admission-plugins=NamespaceAutoProvision --wait=all                                                │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:13 UTC │                     │
	└─────────┴─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┴───────────────────┴─────────┴─────────┴─────────────────────┴─────────────────────┘
	
	
	==> Last Start <==
	Log file created at: 2025/12/02 19:13:42
	Running on machine: ip-172-31-31-251
	Binary: Built with gc go1.25.3 for linux/arm64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I1202 19:13:42.762704   54807 out.go:360] Setting OutFile to fd 1 ...
	I1202 19:13:42.762827   54807 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1202 19:13:42.762831   54807 out.go:374] Setting ErrFile to fd 2...
	I1202 19:13:42.762834   54807 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1202 19:13:42.763078   54807 root.go:338] Updating PATH: /home/jenkins/minikube-integration/22021-2487/.minikube/bin
	I1202 19:13:42.763410   54807 out.go:368] Setting JSON to false
	I1202 19:13:42.764228   54807 start.go:133] hostinfo: {"hostname":"ip-172-31-31-251","uptime":3359,"bootTime":1764699464,"procs":155,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1084-aws","kernelArch":"aarch64","virtualizationSystem":"","virtualizationRole":"","hostId":"982e3628-3742-4b3e-bb63-ac1b07660ec7"}
	I1202 19:13:42.764287   54807 start.go:143] virtualization:  
	I1202 19:13:42.767748   54807 out.go:179] * [functional-449836] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	I1202 19:13:42.771595   54807 out.go:179]   - MINIKUBE_LOCATION=22021
	I1202 19:13:42.771638   54807 notify.go:221] Checking for updates...
	I1202 19:13:42.777727   54807 out.go:179]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I1202 19:13:42.780738   54807 out.go:179]   - KUBECONFIG=/home/jenkins/minikube-integration/22021-2487/kubeconfig
	I1202 19:13:42.783655   54807 out.go:179]   - MINIKUBE_HOME=/home/jenkins/minikube-integration/22021-2487/.minikube
	I1202 19:13:42.786554   54807 out.go:179]   - MINIKUBE_BIN=out/minikube-linux-arm64
	I1202 19:13:42.789556   54807 out.go:179]   - MINIKUBE_FORCE_SYSTEMD=
	I1202 19:13:42.793178   54807 config.go:182] Loaded profile config "functional-449836": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-beta.0
	I1202 19:13:42.793273   54807 driver.go:422] Setting default libvirt URI to qemu:///system
	I1202 19:13:42.817932   54807 docker.go:124] docker version: linux-28.1.1:Docker Engine - Community
	I1202 19:13:42.818037   54807 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1202 19:13:42.893670   54807 info.go:266] docker info: {ID:EOU5:DNGX:XN6V:L2FZ:UXRM:5TWK:EVUR:KC2F:GT7Z:Y4O4:GB77:5PD3 Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:55 SystemTime:2025-12-02 19:13:42.884370868 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214839296 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-31-251 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1202 19:13:42.893764   54807 docker.go:319] overlay module found
	I1202 19:13:42.896766   54807 out.go:179] * Using the docker driver based on existing profile
	I1202 19:13:42.899559   54807 start.go:309] selected driver: docker
	I1202 19:13:42.899567   54807 start.go:927] validating driver "docker" against &{Name:functional-449836 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-449836 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1202 19:13:42.899671   54807 start.go:938] status for docker: {Installed:true Healthy:true Running:false NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I1202 19:13:42.899770   54807 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1202 19:13:42.952802   54807 info.go:266] docker info: {ID:EOU5:DNGX:XN6V:L2FZ:UXRM:5TWK:EVUR:KC2F:GT7Z:Y4O4:GB77:5PD3 Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:55 SystemTime:2025-12-02 19:13:42.943962699 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214839296 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-31-251 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1202 19:13:42.953225   54807 start_flags.go:992] Waiting for all components: map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true]
	I1202 19:13:42.953247   54807 cni.go:84] Creating CNI manager for ""
	I1202 19:13:42.953303   54807 cni.go:143] "docker" driver + "containerd" runtime found, recommending kindnet
	I1202 19:13:42.953342   54807 start.go:353] cluster config:
	{Name:functional-449836 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-449836 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1202 19:13:42.958183   54807 out.go:179] * Starting "functional-449836" primary control-plane node in "functional-449836" cluster
	I1202 19:13:42.960983   54807 cache.go:134] Beginning downloading kic base image for docker with containerd
	I1202 19:13:42.963884   54807 out.go:179] * Pulling base image v0.0.48-1764169655-21974 ...
	I1202 19:13:42.968058   54807 preload.go:188] Checking if preload exists for k8s version v1.35.0-beta.0 and runtime containerd
	I1202 19:13:42.968252   54807 image.go:81] Checking for gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b in local docker daemon
	I1202 19:13:42.989666   54807 image.go:100] Found gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b in local docker daemon, skipping pull
	I1202 19:13:42.989677   54807 cache.go:158] gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b exists in daemon, skipping load
	W1202 19:13:43.031045   54807 preload.go:144] https://storage.googleapis.com/minikube-preloaded-volume-tarballs/v18/v1.35.0-beta.0/preloaded-images-k8s-v18-v1.35.0-beta.0-containerd-overlay2-arm64.tar.lz4 status code: 404
	W1202 19:13:43.240107   54807 preload.go:144] https://github.com/kubernetes-sigs/minikube-preloads/releases/download/v18/preloaded-images-k8s-v18-v1.35.0-beta.0-containerd-overlay2-arm64.tar.lz4 status code: 404
	I1202 19:13:43.240267   54807 profile.go:143] Saving config to /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/config.json ...
	I1202 19:13:43.240445   54807 cache.go:107] acquiring lock: {Name:mk7e3720bc30e96a70479f1acc707ef52791d566 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1202 19:13:43.240540   54807 cache.go:115] /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/kube-controller-manager_v1.35.0-beta.0 exists
	I1202 19:13:43.240557   54807 cache.go:96] cache image "registry.k8s.io/kube-controller-manager:v1.35.0-beta.0" -> "/home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/kube-controller-manager_v1.35.0-beta.0" took 118.031µs
	I1202 19:13:43.240570   54807 cache.go:80] save to tar file registry.k8s.io/kube-controller-manager:v1.35.0-beta.0 -> /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/kube-controller-manager_v1.35.0-beta.0 succeeded
	I1202 19:13:43.240584   54807 cache.go:107] acquiring lock: {Name:mkfa39bba55c97fa80e441f8dcbaf6dc6a2ab6fc Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1202 19:13:43.240616   54807 cache.go:115] /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/kube-apiserver_v1.35.0-beta.0 exists
	I1202 19:13:43.240621   54807 cache.go:96] cache image "registry.k8s.io/kube-apiserver:v1.35.0-beta.0" -> "/home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/kube-apiserver_v1.35.0-beta.0" took 38.835µs
	I1202 19:13:43.240626   54807 cache.go:80] save to tar file registry.k8s.io/kube-apiserver:v1.35.0-beta.0 -> /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/kube-apiserver_v1.35.0-beta.0 succeeded
	I1202 19:13:43.240809   54807 cache.go:243] Successfully downloaded all kic artifacts
	I1202 19:13:43.240835   54807 start.go:360] acquireMachinesLock for functional-449836: {Name:mk8999fdfa518fc15358d07431fe9bec286a035e Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1202 19:13:43.240864   54807 start.go:364] duration metric: took 20.397µs to acquireMachinesLock for "functional-449836"
	I1202 19:13:43.240875   54807 start.go:96] Skipping create...Using existing machine configuration
	I1202 19:13:43.240879   54807 fix.go:54] fixHost starting: 
	I1202 19:13:43.241152   54807 cli_runner.go:164] Run: docker container inspect functional-449836 --format={{.State.Status}}
	I1202 19:13:43.241336   54807 cache.go:107] acquiring lock: {Name:mkb3ffc95e4b7ac3756206049d851bf516a8abb7 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1202 19:13:43.241393   54807 cache.go:115] /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/gcr.io/k8s-minikube/storage-provisioner_v5 exists
	I1202 19:13:43.241400   54807 cache.go:96] cache image "gcr.io/k8s-minikube/storage-provisioner:v5" -> "/home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/gcr.io/k8s-minikube/storage-provisioner_v5" took 69.973µs
	I1202 19:13:43.241406   54807 cache.go:80] save to tar file gcr.io/k8s-minikube/storage-provisioner:v5 -> /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/gcr.io/k8s-minikube/storage-provisioner_v5 succeeded
	I1202 19:13:43.241456   54807 cache.go:107] acquiring lock: {Name:mk87fcb81abcb9216a37cb770c1db1797c0a7f91 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1202 19:13:43.241496   54807 cache.go:115] /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/kube-scheduler_v1.35.0-beta.0 exists
	I1202 19:13:43.241501   54807 cache.go:96] cache image "registry.k8s.io/kube-scheduler:v1.35.0-beta.0" -> "/home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/kube-scheduler_v1.35.0-beta.0" took 46.589µs
	I1202 19:13:43.241506   54807 cache.go:80] save to tar file registry.k8s.io/kube-scheduler:v1.35.0-beta.0 -> /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/kube-scheduler_v1.35.0-beta.0 succeeded
	I1202 19:13:43.241515   54807 cache.go:107] acquiring lock: {Name:mkb0da8840651a370490ea2b46213e13fc0d5dac Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1202 19:13:43.241539   54807 cache.go:115] /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/kube-proxy_v1.35.0-beta.0 exists
	I1202 19:13:43.241543   54807 cache.go:96] cache image "registry.k8s.io/kube-proxy:v1.35.0-beta.0" -> "/home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/kube-proxy_v1.35.0-beta.0" took 29.662µs
	I1202 19:13:43.241548   54807 cache.go:80] save to tar file registry.k8s.io/kube-proxy:v1.35.0-beta.0 -> /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/kube-proxy_v1.35.0-beta.0 succeeded
	I1202 19:13:43.241556   54807 cache.go:107] acquiring lock: {Name:mk9eec99a3e8e54b076a2ce506d08ceb8a7f49cb Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1202 19:13:43.241581   54807 cache.go:115] /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/pause_3.10.1 exists
	I1202 19:13:43.241585   54807 cache.go:96] cache image "registry.k8s.io/pause:3.10.1" -> "/home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/pause_3.10.1" took 29.85µs
	I1202 19:13:43.241589   54807 cache.go:80] save to tar file registry.k8s.io/pause:3.10.1 -> /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/pause_3.10.1 succeeded
	I1202 19:13:43.241615   54807 cache.go:107] acquiring lock: {Name:mk280b51a6d3bfe0cb60ae7355309f1bf1f99e1d Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1202 19:13:43.241641   54807 cache.go:115] /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/etcd_3.6.5-0 exists
	I1202 19:13:43.241629   54807 cache.go:107] acquiring lock: {Name:mkbf2c8ea9fae755e8e7ae1c483527f313757bae Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1202 19:13:43.241645   54807 cache.go:96] cache image "registry.k8s.io/etcd:3.6.5-0" -> "/home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/etcd_3.6.5-0" took 32.345µs
	I1202 19:13:43.241650   54807 cache.go:80] save to tar file registry.k8s.io/etcd:3.6.5-0 -> /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/etcd_3.6.5-0 succeeded
	I1202 19:13:43.241693   54807 cache.go:115] /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/coredns/coredns_v1.13.1 exists
	I1202 19:13:43.241700   54807 cache.go:96] cache image "registry.k8s.io/coredns/coredns:v1.13.1" -> "/home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/coredns/coredns_v1.13.1" took 86.392µs
	I1202 19:13:43.241706   54807 cache.go:80] save to tar file registry.k8s.io/coredns/coredns:v1.13.1 -> /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/coredns/coredns_v1.13.1 succeeded
	I1202 19:13:43.241720   54807 cache.go:87] Successfully saved all images to host disk.
	I1202 19:13:43.258350   54807 fix.go:112] recreateIfNeeded on functional-449836: state=Running err=<nil>
	W1202 19:13:43.258376   54807 fix.go:138] unexpected machine state, will restart: <nil>
	I1202 19:13:43.261600   54807 out.go:252] * Updating the running docker "functional-449836" container ...
	I1202 19:13:43.261627   54807 machine.go:94] provisionDockerMachine start ...
	I1202 19:13:43.261705   54807 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-449836
	I1202 19:13:43.278805   54807 main.go:143] libmachine: Using SSH client type: native
	I1202 19:13:43.279129   54807 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3dad70] 0x3dd270 <nil>  [] 0s} 127.0.0.1 32788 <nil> <nil>}
	I1202 19:13:43.279134   54807 main.go:143] libmachine: About to run SSH command:
	hostname
	I1202 19:13:43.427938   54807 main.go:143] libmachine: SSH cmd err, output: <nil>: functional-449836
	
	I1202 19:13:43.427951   54807 ubuntu.go:182] provisioning hostname "functional-449836"
	I1202 19:13:43.428028   54807 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-449836
	I1202 19:13:43.447456   54807 main.go:143] libmachine: Using SSH client type: native
	I1202 19:13:43.447752   54807 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3dad70] 0x3dd270 <nil>  [] 0s} 127.0.0.1 32788 <nil> <nil>}
	I1202 19:13:43.447759   54807 main.go:143] libmachine: About to run SSH command:
	sudo hostname functional-449836 && echo "functional-449836" | sudo tee /etc/hostname
	I1202 19:13:43.605729   54807 main.go:143] libmachine: SSH cmd err, output: <nil>: functional-449836
	
	I1202 19:13:43.605800   54807 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-449836
	I1202 19:13:43.624976   54807 main.go:143] libmachine: Using SSH client type: native
	I1202 19:13:43.625283   54807 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3dad70] 0x3dd270 <nil>  [] 0s} 127.0.0.1 32788 <nil> <nil>}
	I1202 19:13:43.625296   54807 main.go:143] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sfunctional-449836' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 functional-449836/g' /etc/hosts;
				else 
					echo '127.0.1.1 functional-449836' | sudo tee -a /etc/hosts; 
				fi
			fi
	I1202 19:13:43.772540   54807 main.go:143] libmachine: SSH cmd err, output: <nil>: 
	I1202 19:13:43.772562   54807 ubuntu.go:188] set auth options {CertDir:/home/jenkins/minikube-integration/22021-2487/.minikube CaCertPath:/home/jenkins/minikube-integration/22021-2487/.minikube/certs/ca.pem CaPrivateKeyPath:/home/jenkins/minikube-integration/22021-2487/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/home/jenkins/minikube-integration/22021-2487/.minikube/machines/server.pem ServerKeyPath:/home/jenkins/minikube-integration/22021-2487/.minikube/machines/server-key.pem ClientKeyPath:/home/jenkins/minikube-integration/22021-2487/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/home/jenkins/minikube-integration/22021-2487/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/home/jenkins/minikube-integration/22021-2487/.minikube}
	I1202 19:13:43.772595   54807 ubuntu.go:190] setting up certificates
	I1202 19:13:43.772604   54807 provision.go:84] configureAuth start
	I1202 19:13:43.772671   54807 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" functional-449836
	I1202 19:13:43.790248   54807 provision.go:143] copyHostCerts
	I1202 19:13:43.790316   54807 exec_runner.go:144] found /home/jenkins/minikube-integration/22021-2487/.minikube/ca.pem, removing ...
	I1202 19:13:43.790328   54807 exec_runner.go:203] rm: /home/jenkins/minikube-integration/22021-2487/.minikube/ca.pem
	I1202 19:13:43.790400   54807 exec_runner.go:151] cp: /home/jenkins/minikube-integration/22021-2487/.minikube/certs/ca.pem --> /home/jenkins/minikube-integration/22021-2487/.minikube/ca.pem (1082 bytes)
	I1202 19:13:43.790504   54807 exec_runner.go:144] found /home/jenkins/minikube-integration/22021-2487/.minikube/cert.pem, removing ...
	I1202 19:13:43.790515   54807 exec_runner.go:203] rm: /home/jenkins/minikube-integration/22021-2487/.minikube/cert.pem
	I1202 19:13:43.790538   54807 exec_runner.go:151] cp: /home/jenkins/minikube-integration/22021-2487/.minikube/certs/cert.pem --> /home/jenkins/minikube-integration/22021-2487/.minikube/cert.pem (1123 bytes)
	I1202 19:13:43.790586   54807 exec_runner.go:144] found /home/jenkins/minikube-integration/22021-2487/.minikube/key.pem, removing ...
	I1202 19:13:43.790589   54807 exec_runner.go:203] rm: /home/jenkins/minikube-integration/22021-2487/.minikube/key.pem
	I1202 19:13:43.790610   54807 exec_runner.go:151] cp: /home/jenkins/minikube-integration/22021-2487/.minikube/certs/key.pem --> /home/jenkins/minikube-integration/22021-2487/.minikube/key.pem (1675 bytes)
	I1202 19:13:43.790652   54807 provision.go:117] generating server cert: /home/jenkins/minikube-integration/22021-2487/.minikube/machines/server.pem ca-key=/home/jenkins/minikube-integration/22021-2487/.minikube/certs/ca.pem private-key=/home/jenkins/minikube-integration/22021-2487/.minikube/certs/ca-key.pem org=jenkins.functional-449836 san=[127.0.0.1 192.168.49.2 functional-449836 localhost minikube]
	I1202 19:13:43.836362   54807 provision.go:177] copyRemoteCerts
	I1202 19:13:43.836414   54807 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I1202 19:13:43.836453   54807 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-449836
	I1202 19:13:43.856436   54807 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:32788 SSHKeyPath:/home/jenkins/minikube-integration/22021-2487/.minikube/machines/functional-449836/id_rsa Username:docker}
	I1202 19:13:43.960942   54807 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1082 bytes)
	I1202 19:13:43.990337   54807 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/machines/server.pem --> /etc/docker/server.pem (1220 bytes)
	I1202 19:13:44.010316   54807 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1675 bytes)
	I1202 19:13:44.028611   54807 provision.go:87] duration metric: took 255.971492ms to configureAuth
	I1202 19:13:44.028629   54807 ubuntu.go:206] setting minikube options for container-runtime
	I1202 19:13:44.028821   54807 config.go:182] Loaded profile config "functional-449836": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-beta.0
	I1202 19:13:44.028827   54807 machine.go:97] duration metric: took 767.195405ms to provisionDockerMachine
	I1202 19:13:44.028833   54807 start.go:293] postStartSetup for "functional-449836" (driver="docker")
	I1202 19:13:44.028844   54807 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I1202 19:13:44.028890   54807 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I1202 19:13:44.028937   54807 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-449836
	I1202 19:13:44.046629   54807 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:32788 SSHKeyPath:/home/jenkins/minikube-integration/22021-2487/.minikube/machines/functional-449836/id_rsa Username:docker}
	I1202 19:13:44.156467   54807 ssh_runner.go:195] Run: cat /etc/os-release
	I1202 19:13:44.159958   54807 main.go:143] libmachine: Couldn't set key VERSION_CODENAME, no corresponding struct field found
	I1202 19:13:44.159979   54807 info.go:137] Remote host: Debian GNU/Linux 12 (bookworm)
	I1202 19:13:44.159992   54807 filesync.go:126] Scanning /home/jenkins/minikube-integration/22021-2487/.minikube/addons for local assets ...
	I1202 19:13:44.160053   54807 filesync.go:126] Scanning /home/jenkins/minikube-integration/22021-2487/.minikube/files for local assets ...
	I1202 19:13:44.160131   54807 filesync.go:149] local asset: /home/jenkins/minikube-integration/22021-2487/.minikube/files/etc/ssl/certs/44352.pem -> 44352.pem in /etc/ssl/certs
	I1202 19:13:44.160205   54807 filesync.go:149] local asset: /home/jenkins/minikube-integration/22021-2487/.minikube/files/etc/test/nested/copy/4435/hosts -> hosts in /etc/test/nested/copy/4435
	I1202 19:13:44.160247   54807 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs /etc/test/nested/copy/4435
	I1202 19:13:44.167846   54807 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/files/etc/ssl/certs/44352.pem --> /etc/ssl/certs/44352.pem (1708 bytes)
	I1202 19:13:44.185707   54807 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/files/etc/test/nested/copy/4435/hosts --> /etc/test/nested/copy/4435/hosts (40 bytes)
	I1202 19:13:44.203573   54807 start.go:296] duration metric: took 174.725487ms for postStartSetup
	I1202 19:13:44.203665   54807 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I1202 19:13:44.203703   54807 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-449836
	I1202 19:13:44.221082   54807 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:32788 SSHKeyPath:/home/jenkins/minikube-integration/22021-2487/.minikube/machines/functional-449836/id_rsa Username:docker}
	I1202 19:13:44.321354   54807 ssh_runner.go:195] Run: sh -c "df -BG /var | awk 'NR==2{print $4}'"
	I1202 19:13:44.325951   54807 fix.go:56] duration metric: took 1.085065634s for fixHost
	I1202 19:13:44.325966   54807 start.go:83] releasing machines lock for "functional-449836", held for 1.08509619s
	I1202 19:13:44.326041   54807 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" functional-449836
	I1202 19:13:44.343136   54807 ssh_runner.go:195] Run: cat /version.json
	I1202 19:13:44.343179   54807 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-449836
	I1202 19:13:44.343439   54807 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I1202 19:13:44.343497   54807 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-449836
	I1202 19:13:44.361296   54807 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:32788 SSHKeyPath:/home/jenkins/minikube-integration/22021-2487/.minikube/machines/functional-449836/id_rsa Username:docker}
	I1202 19:13:44.363895   54807 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:32788 SSHKeyPath:/home/jenkins/minikube-integration/22021-2487/.minikube/machines/functional-449836/id_rsa Username:docker}
	I1202 19:13:44.464126   54807 ssh_runner.go:195] Run: systemctl --version
	I1202 19:13:44.557588   54807 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	W1202 19:13:44.561902   54807 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I1202 19:13:44.561962   54807 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%p, " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I1202 19:13:44.569598   54807 cni.go:259] no active bridge cni configs found in "/etc/cni/net.d" - nothing to disable
	I1202 19:13:44.569611   54807 start.go:496] detecting cgroup driver to use...
	I1202 19:13:44.569649   54807 detect.go:187] detected "cgroupfs" cgroup driver on host os
	I1202 19:13:44.569710   54807 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I1202 19:13:44.587349   54807 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I1202 19:13:44.609174   54807 docker.go:218] disabling cri-docker service (if available) ...
	I1202 19:13:44.609228   54807 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.socket
	I1202 19:13:44.629149   54807 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.service
	I1202 19:13:44.643983   54807 ssh_runner.go:195] Run: sudo systemctl disable cri-docker.socket
	I1202 19:13:44.758878   54807 ssh_runner.go:195] Run: sudo systemctl mask cri-docker.service
	I1202 19:13:44.879635   54807 docker.go:234] disabling docker service ...
	I1202 19:13:44.879691   54807 ssh_runner.go:195] Run: sudo systemctl stop -f docker.socket
	I1202 19:13:44.895449   54807 ssh_runner.go:195] Run: sudo systemctl stop -f docker.service
	I1202 19:13:44.908858   54807 ssh_runner.go:195] Run: sudo systemctl disable docker.socket
	I1202 19:13:45.045971   54807 ssh_runner.go:195] Run: sudo systemctl mask docker.service
	I1202 19:13:45.189406   54807 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service docker
	I1202 19:13:45.215003   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I1202 19:13:45.239052   54807 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.10.1"|' /etc/containerd/config.toml"
	I1202 19:13:45.252425   54807 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I1202 19:13:45.264818   54807 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I1202 19:13:45.264881   54807 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I1202 19:13:45.275398   54807 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I1202 19:13:45.286201   54807 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I1202 19:13:45.295830   54807 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I1202 19:13:45.307108   54807 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I1202 19:13:45.315922   54807 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I1202 19:13:45.325735   54807 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I1202 19:13:45.336853   54807 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
	I1202 19:13:45.346391   54807 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I1202 19:13:45.354212   54807 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I1202 19:13:45.361966   54807 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1202 19:13:45.496442   54807 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I1202 19:13:45.617692   54807 start.go:543] Will wait 60s for socket path /run/containerd/containerd.sock
	I1202 19:13:45.617755   54807 ssh_runner.go:195] Run: stat /run/containerd/containerd.sock
	I1202 19:13:45.622143   54807 start.go:564] Will wait 60s for crictl version
	I1202 19:13:45.622212   54807 ssh_runner.go:195] Run: which crictl
	I1202 19:13:45.626172   54807 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl version
	I1202 19:13:45.650746   54807 start.go:580] Version:  0.1.0
	RuntimeName:  containerd
	RuntimeVersion:  v2.1.5
	RuntimeApiVersion:  v1
	I1202 19:13:45.650812   54807 ssh_runner.go:195] Run: containerd --version
	I1202 19:13:45.670031   54807 ssh_runner.go:195] Run: containerd --version
	I1202 19:13:45.697284   54807 out.go:179] * Preparing Kubernetes v1.35.0-beta.0 on containerd 2.1.5 ...
	I1202 19:13:45.700249   54807 cli_runner.go:164] Run: docker network inspect functional-449836 --format "{"Name": "{{.Name}}","Driver": "{{.Driver}}","Subnet": "{{range .IPAM.Config}}{{.Subnet}}{{end}}","Gateway": "{{range .IPAM.Config}}{{.Gateway}}{{end}}","MTU": {{if (index .Options "com.docker.network.driver.mtu")}}{{(index .Options "com.docker.network.driver.mtu")}}{{else}}0{{end}}, "ContainerIPs": [{{range $k,$v := .Containers }}"{{$v.IPv4Address}}",{{end}}]}"
	I1202 19:13:45.717142   54807 ssh_runner.go:195] Run: grep 192.168.49.1	host.minikube.internal$ /etc/hosts
	I1202 19:13:45.724151   54807 out.go:179]   - apiserver.enable-admission-plugins=NamespaceAutoProvision
	I1202 19:13:45.727141   54807 kubeadm.go:884] updating cluster {Name:functional-449836 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-449836 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} ...
	I1202 19:13:45.727279   54807 preload.go:188] Checking if preload exists for k8s version v1.35.0-beta.0 and runtime containerd
	I1202 19:13:45.727346   54807 ssh_runner.go:195] Run: sudo crictl images --output json
	I1202 19:13:45.751767   54807 containerd.go:627] all images are preloaded for containerd runtime.
	I1202 19:13:45.751786   54807 cache_images.go:86] Images are preloaded, skipping loading
	I1202 19:13:45.751792   54807 kubeadm.go:935] updating node { 192.168.49.2 8441 v1.35.0-beta.0 containerd true true} ...
	I1202 19:13:45.751903   54807 kubeadm.go:947] kubelet [Unit]
	Wants=containerd.service
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.35.0-beta.0/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=functional-449836 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.168.49.2
	
	[Install]
	 config:
	{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-449836 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I1202 19:13:45.751976   54807 ssh_runner.go:195] Run: sudo crictl info
	I1202 19:13:45.777030   54807 extraconfig.go:125] Overwriting default enable-admission-plugins=NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota with user provided enable-admission-plugins=NamespaceAutoProvision for component apiserver
	I1202 19:13:45.777052   54807 cni.go:84] Creating CNI manager for ""
	I1202 19:13:45.777060   54807 cni.go:143] "docker" driver + "containerd" runtime found, recommending kindnet
	I1202 19:13:45.777073   54807 kubeadm.go:85] Using pod CIDR: 10.244.0.0/16
	I1202 19:13:45.777095   54807 kubeadm.go:190] kubeadm options: {CertDir:/var/lib/minikube/certs ServiceCIDR:10.96.0.0/12 PodSubnet:10.244.0.0/16 AdvertiseAddress:192.168.49.2 APIServerPort:8441 KubernetesVersion:v1.35.0-beta.0 EtcdDataDir:/var/lib/minikube/etcd EtcdExtraArgs:map[] ClusterName:functional-449836 NodeName:functional-449836 DNSDomain:cluster.local CRISocket:/run/containerd/containerd.sock ImageRepository: ComponentOptions:[{Component:apiServer ExtraArgs:map[enable-admission-plugins:NamespaceAutoProvision] Pairs:map[certSANs:["127.0.0.1", "localhost", "192.168.49.2"]]} {Component:controllerManager ExtraArgs:map[allocate-node-cidrs:true leader-elect:false] Pairs:map[]} {Component:scheduler ExtraArgs:map[leader-elect:false] Pairs:map[]}] FeatureArgs:map[] NodeIP:192.168.49.2 CgroupDriver:cgroupfs ClientCAFile:/var/lib/minikube/certs/ca.crt StaticPodPath:/etc/kubernetes/manifests ControlPlaneAddress:control-plane.minikube.internal KubeProxyOptions:map[] ResolvConfSearchRegression:false KubeletConfigOpts:map[containerRuntimeEndpoint:unix:///run/containerd/containerd.sock hairpinMode:hairpin-veth runtimeRequestTimeout:15m] PrependCriSocketUnix:true}
	I1202 19:13:45.777203   54807 kubeadm.go:196] kubeadm config:
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: InitConfiguration
	localAPIEndpoint:
	  advertiseAddress: 192.168.49.2
	  bindPort: 8441
	bootstrapTokens:
	  - groups:
	      - system:bootstrappers:kubeadm:default-node-token
	    ttl: 24h0m0s
	    usages:
	      - signing
	      - authentication
	nodeRegistration:
	  criSocket: unix:///run/containerd/containerd.sock
	  name: "functional-449836"
	  kubeletExtraArgs:
	    - name: "node-ip"
	      value: "192.168.49.2"
	  taints: []
	---
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: ClusterConfiguration
	apiServer:
	  certSANs: ["127.0.0.1", "localhost", "192.168.49.2"]
	  extraArgs:
	    - name: "enable-admission-plugins"
	      value: "NamespaceAutoProvision"
	controllerManager:
	  extraArgs:
	    - name: "allocate-node-cidrs"
	      value: "true"
	    - name: "leader-elect"
	      value: "false"
	scheduler:
	  extraArgs:
	    - name: "leader-elect"
	      value: "false"
	certificatesDir: /var/lib/minikube/certs
	clusterName: mk
	controlPlaneEndpoint: control-plane.minikube.internal:8441
	etcd:
	  local:
	    dataDir: /var/lib/minikube/etcd
	kubernetesVersion: v1.35.0-beta.0
	networking:
	  dnsDomain: cluster.local
	  podSubnet: "10.244.0.0/16"
	  serviceSubnet: 10.96.0.0/12
	---
	apiVersion: kubelet.config.k8s.io/v1beta1
	kind: KubeletConfiguration
	authentication:
	  x509:
	    clientCAFile: /var/lib/minikube/certs/ca.crt
	cgroupDriver: cgroupfs
	containerRuntimeEndpoint: unix:///run/containerd/containerd.sock
	hairpinMode: hairpin-veth
	runtimeRequestTimeout: 15m
	clusterDomain: "cluster.local"
	# disable disk resource management by default
	imageGCHighThresholdPercent: 100
	evictionHard:
	  nodefs.available: "0%"
	  nodefs.inodesFree: "0%"
	  imagefs.available: "0%"
	failSwapOn: false
	staticPodPath: /etc/kubernetes/manifests
	---
	apiVersion: kubeproxy.config.k8s.io/v1alpha1
	kind: KubeProxyConfiguration
	clusterCIDR: "10.244.0.0/16"
	metricsBindAddress: 0.0.0.0:10249
	conntrack:
	  maxPerCore: 0
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_established"
	  tcpEstablishedTimeout: 0s
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_close"
	  tcpCloseWaitTimeout: 0s
	
	I1202 19:13:45.777274   54807 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.35.0-beta.0
	I1202 19:13:45.785000   54807 binaries.go:51] Found k8s binaries, skipping transfer
	I1202 19:13:45.785061   54807 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /var/tmp/minikube
	I1202 19:13:45.792592   54807 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (328 bytes)
	I1202 19:13:45.805336   54807 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (359 bytes)
	I1202 19:13:45.818427   54807 ssh_runner.go:362] scp memory --> /var/tmp/minikube/kubeadm.yaml.new (2087 bytes)
	I1202 19:13:45.830990   54807 ssh_runner.go:195] Run: grep 192.168.49.2	control-plane.minikube.internal$ /etc/hosts
	I1202 19:13:45.834935   54807 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1202 19:13:45.945402   54807 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I1202 19:13:46.172299   54807 certs.go:69] Setting up /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836 for IP: 192.168.49.2
	I1202 19:13:46.172311   54807 certs.go:195] generating shared ca certs ...
	I1202 19:13:46.172340   54807 certs.go:227] acquiring lock for ca certs: {Name:mk2ce7651a779b9fbf8eac798f9ac184328de0c6 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1202 19:13:46.172494   54807 certs.go:236] skipping valid "minikubeCA" ca cert: /home/jenkins/minikube-integration/22021-2487/.minikube/ca.key
	I1202 19:13:46.172550   54807 certs.go:236] skipping valid "proxyClientCA" ca cert: /home/jenkins/minikube-integration/22021-2487/.minikube/proxy-client-ca.key
	I1202 19:13:46.172557   54807 certs.go:257] generating profile certs ...
	I1202 19:13:46.172651   54807 certs.go:360] skipping valid signed profile cert regeneration for "minikube-user": /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/client.key
	I1202 19:13:46.172725   54807 certs.go:360] skipping valid signed profile cert regeneration for "minikube": /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/apiserver.key.a65b71da
	I1202 19:13:46.172770   54807 certs.go:360] skipping valid signed profile cert regeneration for "aggregator": /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/proxy-client.key
	I1202 19:13:46.172876   54807 certs.go:484] found cert: /home/jenkins/minikube-integration/22021-2487/.minikube/certs/4435.pem (1338 bytes)
	W1202 19:13:46.172906   54807 certs.go:480] ignoring /home/jenkins/minikube-integration/22021-2487/.minikube/certs/4435_empty.pem, impossibly tiny 0 bytes
	I1202 19:13:46.172913   54807 certs.go:484] found cert: /home/jenkins/minikube-integration/22021-2487/.minikube/certs/ca-key.pem (1679 bytes)
	I1202 19:13:46.172944   54807 certs.go:484] found cert: /home/jenkins/minikube-integration/22021-2487/.minikube/certs/ca.pem (1082 bytes)
	I1202 19:13:46.172967   54807 certs.go:484] found cert: /home/jenkins/minikube-integration/22021-2487/.minikube/certs/cert.pem (1123 bytes)
	I1202 19:13:46.172992   54807 certs.go:484] found cert: /home/jenkins/minikube-integration/22021-2487/.minikube/certs/key.pem (1675 bytes)
	I1202 19:13:46.173034   54807 certs.go:484] found cert: /home/jenkins/minikube-integration/22021-2487/.minikube/files/etc/ssl/certs/44352.pem (1708 bytes)
	I1202 19:13:46.174236   54807 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I1202 19:13:46.206005   54807 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1675 bytes)
	I1202 19:13:46.223256   54807 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I1202 19:13:46.250390   54807 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1675 bytes)
	I1202 19:13:46.270550   54807 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1424 bytes)
	I1202 19:13:46.289153   54807 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1675 bytes)
	I1202 19:13:46.307175   54807 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I1202 19:13:46.325652   54807 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1675 bytes)
	I1202 19:13:46.343823   54807 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/files/etc/ssl/certs/44352.pem --> /usr/share/ca-certificates/44352.pem (1708 bytes)
	I1202 19:13:46.361647   54807 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I1202 19:13:46.379597   54807 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/certs/4435.pem --> /usr/share/ca-certificates/4435.pem (1338 bytes)
	I1202 19:13:46.397750   54807 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (738 bytes)
	I1202 19:13:46.411087   54807 ssh_runner.go:195] Run: openssl version
	I1202 19:13:46.418777   54807 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/4435.pem && ln -fs /usr/share/ca-certificates/4435.pem /etc/ssl/certs/4435.pem"
	I1202 19:13:46.427262   54807 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/4435.pem
	I1202 19:13:46.431022   54807 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Dec  2 18:58 /usr/share/ca-certificates/4435.pem
	I1202 19:13:46.431093   54807 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/4435.pem
	I1202 19:13:46.473995   54807 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/51391683.0 || ln -fs /etc/ssl/certs/4435.pem /etc/ssl/certs/51391683.0"
	I1202 19:13:46.482092   54807 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/44352.pem && ln -fs /usr/share/ca-certificates/44352.pem /etc/ssl/certs/44352.pem"
	I1202 19:13:46.490432   54807 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/44352.pem
	I1202 19:13:46.494266   54807 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Dec  2 18:58 /usr/share/ca-certificates/44352.pem
	I1202 19:13:46.494320   54807 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/44352.pem
	I1202 19:13:46.535125   54807 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/3ec20f2e.0 || ln -fs /etc/ssl/certs/44352.pem /etc/ssl/certs/3ec20f2e.0"
	I1202 19:13:46.543277   54807 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I1202 19:13:46.551769   54807 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I1202 19:13:46.555743   54807 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Dec  2 18:48 /usr/share/ca-certificates/minikubeCA.pem
	I1202 19:13:46.555797   54807 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I1202 19:13:46.597778   54807 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
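The three `openssl x509 -hash -noout` calls above compute the subject hash under which OpenSSL expects each CA to be linked in the certs directory (hence the `51391683.0`, `3ec20f2e.0`, and `b5213941.0` symlinks). A minimal sketch of the same scheme, using a throwaway self-signed CA in a temp directory rather than any of minikube's real files:

```shell
set -eu
work=$(mktemp -d)
# Throwaway self-signed CA; every name and path here is illustrative,
# not one of the files minikube manages.
openssl req -x509 -newkey rsa:2048 -nodes -subj "/CN=demoCA" \
  -keyout "$work/ca.key" -out "$work/ca.pem" -days 1 2>/dev/null
# OpenSSL resolves trust anchors by subject hash: a symlink named
# "<hash>.0" pointing at the PEM file.
certhash=$(openssl x509 -hash -noout -in "$work/ca.pem")
ln -fs "$work/ca.pem" "$work/$certhash.0"
echo "$certhash"
```

The `.0` suffix disambiguates distinct CAs that happen to share a subject hash (`.1`, `.2`, ... would follow).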
	I1202 19:13:46.605874   54807 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I1202 19:13:46.609733   54807 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-etcd-client.crt -checkend 86400
	I1202 19:13:46.652482   54807 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-kubelet-client.crt -checkend 86400
	I1202 19:13:46.693214   54807 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/server.crt -checkend 86400
	I1202 19:13:46.734654   54807 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/healthcheck-client.crt -checkend 86400
	I1202 19:13:46.775729   54807 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/peer.crt -checkend 86400
	I1202 19:13:46.821319   54807 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/front-proxy-client.crt -checkend 86400
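Each `-checkend 86400` probe above exits 0 only if the certificate remains valid for another 24 hours, which is how the restart path decides whether the existing control-plane certs can be reused. A sketch with a throwaway certificate (all names illustrative):

```shell
set -eu
dir=$(mktemp -d)
# Throwaway certificate valid for 2 days, standing in for e.g.
# apiserver-kubelet-client.crt checked above.
openssl req -x509 -newkey rsa:2048 -nodes -subj "/CN=checkend-demo" \
  -keyout "$dir/t.key" -out "$dir/t.crt" -days 2 2>/dev/null
# -checkend N exits 0 iff the certificate is still valid N seconds
# from now; 86400s = 24h, the margin used in the log.
if openssl x509 -noout -in "$dir/t.crt" -checkend 86400 >/dev/null; then
  status=ok
else
  status=renew
fi
echo "$status"
```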
	I1202 19:13:46.862299   54807 kubeadm.go:401] StartCluster: {Name:functional-449836 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-449836 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1202 19:13:46.862398   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:paused Name: Namespaces:[kube-system]}
	I1202 19:13:46.862468   54807 ssh_runner.go:195] Run: sudo -s eval "crictl ps -a --quiet --label io.kubernetes.pod.namespace=kube-system"
	I1202 19:13:46.891099   54807 cri.go:89] found id: ""
	I1202 19:13:46.891159   54807 ssh_runner.go:195] Run: sudo ls /var/lib/kubelet/kubeadm-flags.env /var/lib/kubelet/config.yaml /var/lib/minikube/etcd
	I1202 19:13:46.898813   54807 kubeadm.go:417] found existing configuration files, will attempt cluster restart
	I1202 19:13:46.898821   54807 kubeadm.go:598] restartPrimaryControlPlane start ...
	I1202 19:13:46.898874   54807 ssh_runner.go:195] Run: sudo test -d /data/minikube
	I1202 19:13:46.906272   54807 kubeadm.go:131] /data/minikube skipping compat symlinks: sudo test -d /data/minikube: Process exited with status 1
	stdout:
	
	stderr:
	I1202 19:13:46.906775   54807 kubeconfig.go:125] found "functional-449836" server: "https://192.168.49.2:8441"
	I1202 19:13:46.908038   54807 ssh_runner.go:195] Run: sudo diff -u /var/tmp/minikube/kubeadm.yaml /var/tmp/minikube/kubeadm.yaml.new
	I1202 19:13:46.915724   54807 kubeadm.go:645] detected kubeadm config drift (will reconfigure cluster from new /var/tmp/minikube/kubeadm.yaml):
	-- stdout --
	--- /var/tmp/minikube/kubeadm.yaml	2025-12-02 18:59:11.521818114 +0000
	+++ /var/tmp/minikube/kubeadm.yaml.new	2025-12-02 19:13:45.826341203 +0000
	@@ -24,7 +24,7 @@
	   certSANs: ["127.0.0.1", "localhost", "192.168.49.2"]
	   extraArgs:
	     - name: "enable-admission-plugins"
	-      value: "NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota"
	+      value: "NamespaceAutoProvision"
	 controllerManager:
	   extraArgs:
	     - name: "allocate-node-cidrs"
	
	-- /stdout --
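The drift check above renders the desired config to `kubeadm.yaml.new`, diffs it against the deployed `kubeadm.yaml`, and reconfigures only when they differ; here the `enable-admission-plugins` extra-arg changed, so a restart is forced. The pattern can be sketched standalone (file contents below are hypothetical stand-ins):

```shell
set -u
dir=$(mktemp -d)
# Hypothetical stand-ins for the deployed and freshly rendered configs.
printf 'enable-admission-plugins: default-set\n' > "$dir/kubeadm.yaml"
printf 'enable-admission-plugins: NamespaceAutoProvision\n' > "$dir/kubeadm.yaml.new"
# diff exits 0 when the files are identical; any drift triggers adoption
# of the new config (mirroring the later cp of kubeadm.yaml.new).
if diff -u "$dir/kubeadm.yaml" "$dir/kubeadm.yaml.new"; then
  echo "no drift"
else
  echo "drift detected, adopting new config"
  cp "$dir/kubeadm.yaml.new" "$dir/kubeadm.yaml"
fi
```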
	I1202 19:13:46.915744   54807 kubeadm.go:1161] stopping kube-system containers ...
	I1202 19:13:46.915757   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name: Namespaces:[kube-system]}
	I1202 19:13:46.915816   54807 ssh_runner.go:195] Run: sudo -s eval "crictl ps -a --quiet --label io.kubernetes.pod.namespace=kube-system"
	I1202 19:13:46.943936   54807 cri.go:89] found id: ""
	I1202 19:13:46.944009   54807 ssh_runner.go:195] Run: sudo systemctl stop kubelet
	I1202 19:13:46.961843   54807 ssh_runner.go:195] Run: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf
	I1202 19:13:46.971074   54807 kubeadm.go:158] found existing configuration files:
	-rw------- 1 root root 5635 Dec  2 19:03 /etc/kubernetes/admin.conf
	-rw------- 1 root root 5640 Dec  2 19:03 /etc/kubernetes/controller-manager.conf
	-rw------- 1 root root 5672 Dec  2 19:03 /etc/kubernetes/kubelet.conf
	-rw------- 1 root root 5584 Dec  2 19:03 /etc/kubernetes/scheduler.conf
	
	I1202 19:13:46.971137   54807 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/admin.conf
	I1202 19:13:46.979452   54807 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf
	I1202 19:13:46.987399   54807 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/kubelet.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf: Process exited with status 1
	stdout:
	
	stderr:
	I1202 19:13:46.987454   54807 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/kubelet.conf
	I1202 19:13:46.994869   54807 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf
	I1202 19:13:47.002498   54807 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/controller-manager.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf: Process exited with status 1
	stdout:
	
	stderr:
	I1202 19:13:47.002560   54807 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/controller-manager.conf
	I1202 19:13:47.010116   54807 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf
	I1202 19:13:47.017891   54807 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/scheduler.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf: Process exited with status 1
	stdout:
	
	stderr:
	I1202 19:13:47.017946   54807 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/scheduler.conf
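Each grep-then-rm pair above checks whether a kubeconfig still points at the expected control-plane endpoint and deletes it when it does not, so the subsequent `kubeadm init phase kubeconfig` can regenerate it. One such check, sketched with a hypothetical stale file:

```shell
set -u
dir=$(mktemp -d)
# Hypothetical kubelet.conf whose server line predates the endpoint change.
printf 'server: https://192.168.49.2:8443\n' > "$dir/kubelet.conf"
want='https://control-plane.minikube.internal:8441'
if grep -q "$want" "$dir/kubelet.conf"; then
  echo "endpoint current, keeping kubelet.conf"
else
  # Stale endpoint: drop the file so the kubeconfig phase regenerates it.
  rm -f "$dir/kubelet.conf"
  echo "removed stale kubelet.conf"
fi
```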
	I1202 19:13:47.025383   54807 ssh_runner.go:195] Run: sudo cp /var/tmp/minikube/kubeadm.yaml.new /var/tmp/minikube/kubeadm.yaml
	I1202 19:13:47.033423   54807 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init phase certs all --config /var/tmp/minikube/kubeadm.yaml"
	I1202 19:13:47.076377   54807 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init phase kubeconfig all --config /var/tmp/minikube/kubeadm.yaml"
	I1202 19:13:48.395417   54807 ssh_runner.go:235] Completed: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init phase kubeconfig all --config /var/tmp/minikube/kubeadm.yaml": (1.319015091s)
	I1202 19:13:48.395495   54807 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init phase kubelet-start --config /var/tmp/minikube/kubeadm.yaml"
	I1202 19:13:48.604942   54807 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init phase control-plane all --config /var/tmp/minikube/kubeadm.yaml"
	I1202 19:13:48.668399   54807 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init phase etcd local --config /var/tmp/minikube/kubeadm.yaml"
	I1202 19:13:48.712382   54807 api_server.go:52] waiting for apiserver process to appear ...
	I1202 19:13:48.712452   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	[... identical probe "sudo pgrep -xnf kube-apiserver.*minikube.*" repeated at ~500ms intervals, 19:13:49.212900 through 19:14:47.713210 (118 attempts, none matching) ...]
	I1202 19:14:48.212639   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
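The minute-long run of pgrep probes above is a fixed-interval wait for the apiserver process to appear, abandoned at a deadline. The shape of that loop can be sketched with a background `sleep` standing in for kube-apiserver (the pattern and the 10s deadline are invented for the demo):

```shell
set -u
# A background sleep stands in for the kube-apiserver process.
sleep 5 & stand_in=$!
deadline=$(( $(date +%s) + 10 ))
found=no
while [ "$(date +%s)" -lt "$deadline" ]; do
  # Same shape as the probes above: pgrep for the process by name/pattern.
  if pgrep -x sleep >/dev/null 2>&1; then
    found=yes
    break
  fi
  sleep 0.5
done
kill "$stand_in" 2>/dev/null || true
echo "$found"
```

In the log the deadline expires without a match, which is why the flow falls through to log gathering below.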
	I1202 19:14:48.713264   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:14:48.713347   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:14:48.742977   54807 cri.go:89] found id: ""
	I1202 19:14:48.742990   54807 logs.go:282] 0 containers: []
	W1202 19:14:48.742997   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:14:48.743002   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:14:48.743061   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:14:48.767865   54807 cri.go:89] found id: ""
	I1202 19:14:48.767879   54807 logs.go:282] 0 containers: []
	W1202 19:14:48.767886   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:14:48.767892   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:14:48.767949   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:14:48.792531   54807 cri.go:89] found id: ""
	I1202 19:14:48.792544   54807 logs.go:282] 0 containers: []
	W1202 19:14:48.792560   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:14:48.792566   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:14:48.792624   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:14:48.821644   54807 cri.go:89] found id: ""
	I1202 19:14:48.821657   54807 logs.go:282] 0 containers: []
	W1202 19:14:48.821665   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:14:48.821670   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:14:48.821729   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:14:48.847227   54807 cri.go:89] found id: ""
	I1202 19:14:48.847246   54807 logs.go:282] 0 containers: []
	W1202 19:14:48.847253   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:14:48.847258   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:14:48.847318   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:14:48.872064   54807 cri.go:89] found id: ""
	I1202 19:14:48.872084   54807 logs.go:282] 0 containers: []
	W1202 19:14:48.872091   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:14:48.872097   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:14:48.872155   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:14:48.895905   54807 cri.go:89] found id: ""
	I1202 19:14:48.895919   54807 logs.go:282] 0 containers: []
	W1202 19:14:48.895925   54807 logs.go:284] No container was found matching "kindnet"
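Each `found id: ""` above comes from a `crictl ps -a --quiet --name=...` call that printed nothing, which minikube counts as zero containers. Since crictl itself is not assumed to be available here, a stub shell function models the empty result:

```shell
# Hypothetical stand-in: a crictl stub that prints no container IDs,
# mirroring the empty --quiet listings recorded above.
crictl() { :; }
ids=$(crictl ps -a --quiet --name=kube-apiserver)
count=0
# --quiet prints one container ID per line; no output means none matched.
[ -n "$ids" ] && count=$(printf '%s\n' "$ids" | wc -l)
echo "$count containers"
```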
	I1202 19:14:48.895933   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:14:48.895945   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:14:48.962492   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:14:48.954400   11315 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:14:48.954981   11315 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:14:48.956769   11315 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:14:48.957381   11315 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:14:48.959046   11315 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:14:48.954400   11315 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:14:48.954981   11315 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:14:48.956769   11315 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:14:48.957381   11315 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:14:48.959046   11315 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:14:48.962515   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:14:48.962526   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:14:49.026861   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:14:49.026881   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:14:49.059991   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:14:49.060006   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:14:49.119340   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:14:49.119357   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:14:51.632315   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:51.642501   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:14:51.642560   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:14:51.669041   54807 cri.go:89] found id: ""
	I1202 19:14:51.669054   54807 logs.go:282] 0 containers: []
	W1202 19:14:51.669061   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:14:51.669086   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:14:51.669150   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:14:51.698828   54807 cri.go:89] found id: ""
	I1202 19:14:51.698857   54807 logs.go:282] 0 containers: []
	W1202 19:14:51.698864   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:14:51.698870   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:14:51.698939   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:14:51.739419   54807 cri.go:89] found id: ""
	I1202 19:14:51.739446   54807 logs.go:282] 0 containers: []
	W1202 19:14:51.739454   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:14:51.739459   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:14:51.739532   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:14:51.764613   54807 cri.go:89] found id: ""
	I1202 19:14:51.764627   54807 logs.go:282] 0 containers: []
	W1202 19:14:51.764633   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:14:51.764639   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:14:51.764698   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:14:51.790197   54807 cri.go:89] found id: ""
	I1202 19:14:51.790211   54807 logs.go:282] 0 containers: []
	W1202 19:14:51.790217   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:14:51.790222   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:14:51.790281   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:14:51.824131   54807 cri.go:89] found id: ""
	I1202 19:14:51.824144   54807 logs.go:282] 0 containers: []
	W1202 19:14:51.824151   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:14:51.824170   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:14:51.824228   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:14:51.848893   54807 cri.go:89] found id: ""
	I1202 19:14:51.848907   54807 logs.go:282] 0 containers: []
	W1202 19:14:51.848914   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:14:51.848922   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:14:51.848932   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:14:51.877099   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:14:51.877114   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:14:51.933539   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:14:51.933560   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:14:51.944309   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:14:51.944346   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:14:52.014156   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:14:52.005976   11437 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:14:52.006753   11437 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:14:52.008549   11437 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:14:52.009096   11437 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:14:52.010706   11437 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:14:52.005976   11437 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:14:52.006753   11437 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:14:52.008549   11437 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:14:52.009096   11437 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:14:52.010706   11437 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:14:52.014167   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:14:52.014178   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:14:54.578451   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:54.588802   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:14:54.588862   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:14:54.613620   54807 cri.go:89] found id: ""
	I1202 19:14:54.613633   54807 logs.go:282] 0 containers: []
	W1202 19:14:54.613640   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:14:54.613646   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:14:54.613704   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:14:54.637471   54807 cri.go:89] found id: ""
	I1202 19:14:54.637486   54807 logs.go:282] 0 containers: []
	W1202 19:14:54.637498   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:14:54.637503   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:14:54.637561   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:14:54.662053   54807 cri.go:89] found id: ""
	I1202 19:14:54.662066   54807 logs.go:282] 0 containers: []
	W1202 19:14:54.662073   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:14:54.662079   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:14:54.662135   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:14:54.694901   54807 cri.go:89] found id: ""
	I1202 19:14:54.694916   54807 logs.go:282] 0 containers: []
	W1202 19:14:54.694923   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:14:54.694928   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:14:54.694998   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:14:54.728487   54807 cri.go:89] found id: ""
	I1202 19:14:54.728500   54807 logs.go:282] 0 containers: []
	W1202 19:14:54.728507   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:14:54.728512   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:14:54.728569   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:14:54.756786   54807 cri.go:89] found id: ""
	I1202 19:14:54.756800   54807 logs.go:282] 0 containers: []
	W1202 19:14:54.756806   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:14:54.756812   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:14:54.756868   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:14:54.782187   54807 cri.go:89] found id: ""
	I1202 19:14:54.782200   54807 logs.go:282] 0 containers: []
	W1202 19:14:54.782212   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:14:54.782220   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:14:54.782231   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:14:54.846497   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:14:54.838849   11526 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:14:54.839509   11526 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:14:54.840990   11526 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:14:54.841320   11526 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:14:54.842849   11526 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:14:54.838849   11526 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:14:54.839509   11526 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:14:54.840990   11526 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:14:54.841320   11526 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:14:54.842849   11526 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:14:54.846510   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:14:54.846521   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:14:54.909600   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:14:54.909620   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:14:54.943132   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:14:54.943150   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:14:55.006561   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:14:55.006581   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:14:57.519164   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:57.529445   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:14:57.529506   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:14:57.554155   54807 cri.go:89] found id: ""
	I1202 19:14:57.554168   54807 logs.go:282] 0 containers: []
	W1202 19:14:57.554176   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:14:57.554181   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:14:57.554240   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:14:57.579453   54807 cri.go:89] found id: ""
	I1202 19:14:57.579468   54807 logs.go:282] 0 containers: []
	W1202 19:14:57.579474   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:14:57.579480   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:14:57.579537   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:14:57.608139   54807 cri.go:89] found id: ""
	I1202 19:14:57.608152   54807 logs.go:282] 0 containers: []
	W1202 19:14:57.608160   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:14:57.608165   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:14:57.608224   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:14:57.632309   54807 cri.go:89] found id: ""
	I1202 19:14:57.632360   54807 logs.go:282] 0 containers: []
	W1202 19:14:57.632368   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:14:57.632374   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:14:57.632434   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:14:57.657933   54807 cri.go:89] found id: ""
	I1202 19:14:57.657947   54807 logs.go:282] 0 containers: []
	W1202 19:14:57.657954   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:14:57.657959   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:14:57.658019   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:14:57.698982   54807 cri.go:89] found id: ""
	I1202 19:14:57.698996   54807 logs.go:282] 0 containers: []
	W1202 19:14:57.699002   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:14:57.699008   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:14:57.699105   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:14:57.738205   54807 cri.go:89] found id: ""
	I1202 19:14:57.738219   54807 logs.go:282] 0 containers: []
	W1202 19:14:57.738226   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:14:57.738234   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:14:57.738245   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:14:57.802193   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:14:57.794304   11633 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:14:57.794844   11633 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:14:57.796409   11633 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:14:57.796881   11633 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:14:57.798306   11633 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:14:57.794304   11633 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:14:57.794844   11633 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:14:57.796409   11633 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:14:57.796881   11633 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:14:57.798306   11633 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:14:57.802204   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:14:57.802215   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:14:57.865638   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:14:57.865657   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:14:57.900835   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:14:57.900850   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:14:57.958121   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:14:57.958139   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:15:00.502580   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:15:00.515602   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:15:00.515692   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:15:00.553262   54807 cri.go:89] found id: ""
	I1202 19:15:00.553290   54807 logs.go:282] 0 containers: []
	W1202 19:15:00.553298   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:15:00.553304   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:15:00.553372   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:15:00.592663   54807 cri.go:89] found id: ""
	I1202 19:15:00.592678   54807 logs.go:282] 0 containers: []
	W1202 19:15:00.592686   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:15:00.592691   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:15:00.592782   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:15:00.624403   54807 cri.go:89] found id: ""
	I1202 19:15:00.624423   54807 logs.go:282] 0 containers: []
	W1202 19:15:00.624431   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:15:00.624438   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:15:00.624521   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:15:00.659265   54807 cri.go:89] found id: ""
	I1202 19:15:00.659280   54807 logs.go:282] 0 containers: []
	W1202 19:15:00.659288   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:15:00.659294   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:15:00.659383   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:15:00.695489   54807 cri.go:89] found id: ""
	I1202 19:15:00.695508   54807 logs.go:282] 0 containers: []
	W1202 19:15:00.695517   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:15:00.695523   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:15:00.695592   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:15:00.732577   54807 cri.go:89] found id: ""
	I1202 19:15:00.732592   54807 logs.go:282] 0 containers: []
	W1202 19:15:00.732600   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:15:00.732607   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:15:00.732696   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:15:00.767521   54807 cri.go:89] found id: ""
	I1202 19:15:00.767538   54807 logs.go:282] 0 containers: []
	W1202 19:15:00.767546   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:15:00.767555   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:15:00.767566   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:15:00.829818   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:15:00.829837   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:15:00.842792   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:15:00.842810   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:15:00.919161   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:15:00.909398   11742 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:00.910484   11742 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:00.912564   11742 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:00.913381   11742 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:00.915298   11742 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:15:00.909398   11742 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:00.910484   11742 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:00.912564   11742 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:00.913381   11742 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:00.915298   11742 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:15:00.919174   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:15:00.919193   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:15:00.985798   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:15:00.985819   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:15:03.521258   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:15:03.531745   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:15:03.531810   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:15:03.556245   54807 cri.go:89] found id: ""
	I1202 19:15:03.556258   54807 logs.go:282] 0 containers: []
	W1202 19:15:03.556265   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:15:03.556271   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:15:03.556355   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:15:03.580774   54807 cri.go:89] found id: ""
	I1202 19:15:03.580787   54807 logs.go:282] 0 containers: []
	W1202 19:15:03.580794   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:15:03.580799   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:15:03.580857   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:15:03.606247   54807 cri.go:89] found id: ""
	I1202 19:15:03.606261   54807 logs.go:282] 0 containers: []
	W1202 19:15:03.606269   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:15:03.606274   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:15:03.606335   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:15:03.631169   54807 cri.go:89] found id: ""
	I1202 19:15:03.631182   54807 logs.go:282] 0 containers: []
	W1202 19:15:03.631189   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:15:03.631195   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:15:03.631252   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:15:03.657089   54807 cri.go:89] found id: ""
	I1202 19:15:03.657111   54807 logs.go:282] 0 containers: []
	W1202 19:15:03.657118   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:15:03.657124   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:15:03.657183   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:15:03.699997   54807 cri.go:89] found id: ""
	I1202 19:15:03.700010   54807 logs.go:282] 0 containers: []
	W1202 19:15:03.700017   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:15:03.700023   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:15:03.700081   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:15:03.725717   54807 cri.go:89] found id: ""
	I1202 19:15:03.725731   54807 logs.go:282] 0 containers: []
	W1202 19:15:03.725738   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:15:03.725746   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:15:03.725755   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:15:03.793907   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:15:03.793928   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:15:03.822178   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:15:03.822199   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:15:03.881429   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:15:03.881453   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:15:03.892554   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:15:03.892569   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:15:03.960792   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:15:03.952836   11862 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:03.953779   11862 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:03.955515   11862 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:03.955829   11862 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:03.957393   11862 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:15:03.952836   11862 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:03.953779   11862 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:03.955515   11862 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:03.955829   11862 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:03.957393   11862 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:15:06.461036   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:15:06.471459   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:15:06.471519   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:15:06.500164   54807 cri.go:89] found id: ""
	I1202 19:15:06.500178   54807 logs.go:282] 0 containers: []
	W1202 19:15:06.500184   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:15:06.500190   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:15:06.500253   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:15:06.526532   54807 cri.go:89] found id: ""
	I1202 19:15:06.526545   54807 logs.go:282] 0 containers: []
	W1202 19:15:06.526552   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:15:06.526558   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:15:06.526616   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:15:06.551534   54807 cri.go:89] found id: ""
	I1202 19:15:06.551553   54807 logs.go:282] 0 containers: []
	W1202 19:15:06.551560   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:15:06.551566   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:15:06.551628   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:15:06.577486   54807 cri.go:89] found id: ""
	I1202 19:15:06.577500   54807 logs.go:282] 0 containers: []
	W1202 19:15:06.577506   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:15:06.577512   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:15:06.577570   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:15:06.607506   54807 cri.go:89] found id: ""
	I1202 19:15:06.607520   54807 logs.go:282] 0 containers: []
	W1202 19:15:06.607529   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:15:06.607535   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:15:06.607663   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:15:06.632779   54807 cri.go:89] found id: ""
	I1202 19:15:06.632792   54807 logs.go:282] 0 containers: []
	W1202 19:15:06.632799   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:15:06.632805   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:15:06.632866   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:15:06.656916   54807 cri.go:89] found id: ""
	I1202 19:15:06.656928   54807 logs.go:282] 0 containers: []
	W1202 19:15:06.656936   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:15:06.656943   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:15:06.656953   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:15:06.721178   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:15:06.721197   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:15:06.733421   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:15:06.733437   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:15:06.806706   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:15:06.798437   11957 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:06.799136   11957 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:06.800799   11957 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:06.801507   11957 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:06.803118   11957 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:15:06.798437   11957 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:06.799136   11957 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:06.800799   11957 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:06.801507   11957 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:06.803118   11957 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:15:06.806717   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:15:06.806728   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:15:06.870452   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:15:06.870471   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:15:09.403297   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:15:09.414259   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:15:09.414319   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:15:09.442090   54807 cri.go:89] found id: ""
	I1202 19:15:09.442103   54807 logs.go:282] 0 containers: []
	W1202 19:15:09.442110   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:15:09.442115   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:15:09.442175   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:15:09.471784   54807 cri.go:89] found id: ""
	I1202 19:15:09.471797   54807 logs.go:282] 0 containers: []
	W1202 19:15:09.471804   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:15:09.471809   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:15:09.471887   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:15:09.496688   54807 cri.go:89] found id: ""
	I1202 19:15:09.496701   54807 logs.go:282] 0 containers: []
	W1202 19:15:09.496708   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:15:09.496714   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:15:09.496773   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:15:09.522932   54807 cri.go:89] found id: ""
	I1202 19:15:09.522946   54807 logs.go:282] 0 containers: []
	W1202 19:15:09.522952   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:15:09.522957   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:15:09.523018   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:15:09.550254   54807 cri.go:89] found id: ""
	I1202 19:15:09.550268   54807 logs.go:282] 0 containers: []
	W1202 19:15:09.550275   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:15:09.550280   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:15:09.550341   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:15:09.578955   54807 cri.go:89] found id: ""
	I1202 19:15:09.578968   54807 logs.go:282] 0 containers: []
	W1202 19:15:09.578975   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:15:09.578980   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:15:09.579041   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:15:09.603797   54807 cri.go:89] found id: ""
	I1202 19:15:09.603812   54807 logs.go:282] 0 containers: []
	W1202 19:15:09.603819   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:15:09.603827   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:15:09.603837   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:15:09.660195   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:15:09.660215   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:15:09.671581   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:15:09.671596   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:15:09.755982   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:15:09.748446   12065 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:09.749204   12065 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:09.750900   12065 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:09.751203   12065 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:09.752674   12065 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:15:09.748446   12065 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:09.749204   12065 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:09.750900   12065 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:09.751203   12065 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:09.752674   12065 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:15:09.755993   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:15:09.756013   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:15:09.820958   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:15:09.820977   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:15:12.349982   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:15:12.359890   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:15:12.359953   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:15:12.387716   54807 cri.go:89] found id: ""
	I1202 19:15:12.387729   54807 logs.go:282] 0 containers: []
	W1202 19:15:12.387736   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:15:12.387741   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:15:12.387802   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:15:12.413168   54807 cri.go:89] found id: ""
	I1202 19:15:12.413182   54807 logs.go:282] 0 containers: []
	W1202 19:15:12.413188   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:15:12.413194   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:15:12.413262   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:15:12.441234   54807 cri.go:89] found id: ""
	I1202 19:15:12.441247   54807 logs.go:282] 0 containers: []
	W1202 19:15:12.441253   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:15:12.441262   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:15:12.441321   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:15:12.465660   54807 cri.go:89] found id: ""
	I1202 19:15:12.465673   54807 logs.go:282] 0 containers: []
	W1202 19:15:12.465680   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:15:12.465689   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:15:12.465747   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:15:12.489519   54807 cri.go:89] found id: ""
	I1202 19:15:12.489532   54807 logs.go:282] 0 containers: []
	W1202 19:15:12.489540   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:15:12.489545   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:15:12.489605   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:15:12.514756   54807 cri.go:89] found id: ""
	I1202 19:15:12.514770   54807 logs.go:282] 0 containers: []
	W1202 19:15:12.514777   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:15:12.514782   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:15:12.514843   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:15:12.538845   54807 cri.go:89] found id: ""
	I1202 19:15:12.538858   54807 logs.go:282] 0 containers: []
	W1202 19:15:12.538865   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:15:12.538872   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:15:12.538884   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:15:12.549453   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:15:12.549477   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:15:12.616294   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:15:12.608411   12164 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:12.609095   12164 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:12.610814   12164 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:12.611359   12164 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:12.612794   12164 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:15:12.608411   12164 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:12.609095   12164 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:12.610814   12164 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:12.611359   12164 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:12.612794   12164 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:15:12.616304   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:15:12.616315   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:15:12.679579   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:15:12.679598   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:15:12.712483   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:15:12.712499   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:15:15.277003   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:15:15.287413   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:15:15.287496   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:15:15.313100   54807 cri.go:89] found id: ""
	I1202 19:15:15.313113   54807 logs.go:282] 0 containers: []
	W1202 19:15:15.313120   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:15:15.313135   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:15:15.313194   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:15:15.339367   54807 cri.go:89] found id: ""
	I1202 19:15:15.339381   54807 logs.go:282] 0 containers: []
	W1202 19:15:15.339387   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:15:15.339393   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:15:15.339463   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:15:15.364247   54807 cri.go:89] found id: ""
	I1202 19:15:15.364270   54807 logs.go:282] 0 containers: []
	W1202 19:15:15.364277   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:15:15.364283   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:15:15.364393   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:15:15.389379   54807 cri.go:89] found id: ""
	I1202 19:15:15.389393   54807 logs.go:282] 0 containers: []
	W1202 19:15:15.389401   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:15:15.389412   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:15:15.389472   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:15:15.414364   54807 cri.go:89] found id: ""
	I1202 19:15:15.414378   54807 logs.go:282] 0 containers: []
	W1202 19:15:15.414386   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:15:15.414391   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:15:15.414455   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:15:15.438995   54807 cri.go:89] found id: ""
	I1202 19:15:15.439009   54807 logs.go:282] 0 containers: []
	W1202 19:15:15.439024   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:15:15.439030   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:15:15.439097   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:15:15.467973   54807 cri.go:89] found id: ""
	I1202 19:15:15.467986   54807 logs.go:282] 0 containers: []
	W1202 19:15:15.467993   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:15:15.468001   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:15:15.468010   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:15:15.534212   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:15:15.526543   12263 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:15.527142   12263 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:15.528724   12263 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:15.529277   12263 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:15.530859   12263 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:15:15.526543   12263 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:15.527142   12263 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:15.528724   12263 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:15.529277   12263 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:15.530859   12263 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:15:15.534222   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:15:15.534233   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:15:15.602898   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:15:15.602917   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:15:15.634225   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:15:15.634242   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:15:15.693229   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:15:15.693247   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:15:18.205585   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:15:18.217019   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:15:18.217080   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:15:18.243139   54807 cri.go:89] found id: ""
	I1202 19:15:18.243153   54807 logs.go:282] 0 containers: []
	W1202 19:15:18.243160   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:15:18.243176   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:15:18.243234   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:15:18.266826   54807 cri.go:89] found id: ""
	I1202 19:15:18.266839   54807 logs.go:282] 0 containers: []
	W1202 19:15:18.266846   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:15:18.266851   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:15:18.266911   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:15:18.291760   54807 cri.go:89] found id: ""
	I1202 19:15:18.291773   54807 logs.go:282] 0 containers: []
	W1202 19:15:18.291781   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:15:18.291795   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:15:18.291853   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:15:18.315881   54807 cri.go:89] found id: ""
	I1202 19:15:18.315895   54807 logs.go:282] 0 containers: []
	W1202 19:15:18.315902   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:15:18.315907   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:15:18.315963   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:15:18.354620   54807 cri.go:89] found id: ""
	I1202 19:15:18.354633   54807 logs.go:282] 0 containers: []
	W1202 19:15:18.354640   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:15:18.354649   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:15:18.354708   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:15:18.378919   54807 cri.go:89] found id: ""
	I1202 19:15:18.378932   54807 logs.go:282] 0 containers: []
	W1202 19:15:18.378939   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:15:18.378945   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:15:18.379003   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:15:18.403461   54807 cri.go:89] found id: ""
	I1202 19:15:18.403474   54807 logs.go:282] 0 containers: []
	W1202 19:15:18.403482   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:15:18.403489   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:15:18.403499   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:15:18.460043   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:15:18.460062   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:15:18.471326   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:15:18.471343   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:15:18.533325   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:15:18.525005   12371 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:18.525603   12371 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:18.527360   12371 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:18.527971   12371 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:18.529742   12371 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:15:18.525005   12371 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:18.525603   12371 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:18.527360   12371 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:18.527971   12371 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:18.529742   12371 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:15:18.533335   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:15:18.533346   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:15:18.595843   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:15:18.595862   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:15:21.128472   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:15:21.138623   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:15:21.138683   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:15:21.163008   54807 cri.go:89] found id: ""
	I1202 19:15:21.163021   54807 logs.go:282] 0 containers: []
	W1202 19:15:21.163028   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:15:21.163039   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:15:21.163096   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:15:21.186917   54807 cri.go:89] found id: ""
	I1202 19:15:21.186930   54807 logs.go:282] 0 containers: []
	W1202 19:15:21.186937   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:15:21.186942   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:15:21.187000   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:15:21.212853   54807 cri.go:89] found id: ""
	I1202 19:15:21.212866   54807 logs.go:282] 0 containers: []
	W1202 19:15:21.212873   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:15:21.212878   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:15:21.212937   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:15:21.240682   54807 cri.go:89] found id: ""
	I1202 19:15:21.240695   54807 logs.go:282] 0 containers: []
	W1202 19:15:21.240703   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:15:21.240708   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:15:21.240765   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:15:21.264693   54807 cri.go:89] found id: ""
	I1202 19:15:21.264706   54807 logs.go:282] 0 containers: []
	W1202 19:15:21.264713   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:15:21.264718   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:15:21.264778   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:15:21.288193   54807 cri.go:89] found id: ""
	I1202 19:15:21.288207   54807 logs.go:282] 0 containers: []
	W1202 19:15:21.288214   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:15:21.288219   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:15:21.288278   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:15:21.313950   54807 cri.go:89] found id: ""
	I1202 19:15:21.313964   54807 logs.go:282] 0 containers: []
	W1202 19:15:21.313971   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:15:21.313979   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:15:21.313990   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:15:21.324612   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:15:21.324626   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:15:21.388157   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:15:21.380313   12478 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:21.381141   12478 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:21.382761   12478 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:21.383081   12478 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:21.384783   12478 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:15:21.380313   12478 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:21.381141   12478 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:21.382761   12478 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:21.383081   12478 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:21.384783   12478 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:15:21.388177   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:15:21.388188   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:15:21.451835   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:15:21.451853   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:15:21.480172   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:15:21.480187   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:15:24.037107   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:15:24.047291   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:15:24.047362   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:15:24.072397   54807 cri.go:89] found id: ""
	I1202 19:15:24.072411   54807 logs.go:282] 0 containers: []
	W1202 19:15:24.072418   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:15:24.072424   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:15:24.072486   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:15:24.097793   54807 cri.go:89] found id: ""
	I1202 19:15:24.097807   54807 logs.go:282] 0 containers: []
	W1202 19:15:24.097814   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:15:24.097819   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:15:24.097879   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:15:24.122934   54807 cri.go:89] found id: ""
	I1202 19:15:24.122947   54807 logs.go:282] 0 containers: []
	W1202 19:15:24.122954   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:15:24.122960   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:15:24.123020   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:15:24.147849   54807 cri.go:89] found id: ""
	I1202 19:15:24.147863   54807 logs.go:282] 0 containers: []
	W1202 19:15:24.147869   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:15:24.147875   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:15:24.147935   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:15:24.172919   54807 cri.go:89] found id: ""
	I1202 19:15:24.172932   54807 logs.go:282] 0 containers: []
	W1202 19:15:24.172939   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:15:24.172944   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:15:24.173004   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:15:24.197266   54807 cri.go:89] found id: ""
	I1202 19:15:24.197280   54807 logs.go:282] 0 containers: []
	W1202 19:15:24.197287   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:15:24.197293   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:15:24.197351   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:15:24.222541   54807 cri.go:89] found id: ""
	I1202 19:15:24.222555   54807 logs.go:282] 0 containers: []
	W1202 19:15:24.222562   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:15:24.222572   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:15:24.222582   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:15:24.278762   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:15:24.278784   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:15:24.289861   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:15:24.289877   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:15:24.353810   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:15:24.345889   12589 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:24.346412   12589 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:24.347992   12589 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:24.348533   12589 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:24.350299   12589 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:15:24.345889   12589 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:24.346412   12589 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:24.347992   12589 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:24.348533   12589 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:24.350299   12589 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:15:24.353831   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:15:24.353842   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:15:24.416010   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:15:24.416029   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:15:26.947462   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:15:26.958975   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:15:26.959033   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:15:26.992232   54807 cri.go:89] found id: ""
	I1202 19:15:26.992257   54807 logs.go:282] 0 containers: []
	W1202 19:15:26.992264   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:15:26.992270   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:15:26.992354   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:15:27.021036   54807 cri.go:89] found id: ""
	I1202 19:15:27.021049   54807 logs.go:282] 0 containers: []
	W1202 19:15:27.021056   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:15:27.021062   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:15:27.021119   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:15:27.052008   54807 cri.go:89] found id: ""
	I1202 19:15:27.052022   54807 logs.go:282] 0 containers: []
	W1202 19:15:27.052028   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:15:27.052034   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:15:27.052093   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:15:27.076184   54807 cri.go:89] found id: ""
	I1202 19:15:27.076197   54807 logs.go:282] 0 containers: []
	W1202 19:15:27.076204   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:15:27.076209   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:15:27.076266   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:15:27.100296   54807 cri.go:89] found id: ""
	I1202 19:15:27.100308   54807 logs.go:282] 0 containers: []
	W1202 19:15:27.100315   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:15:27.100355   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:15:27.100413   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:15:27.125762   54807 cri.go:89] found id: ""
	I1202 19:15:27.125776   54807 logs.go:282] 0 containers: []
	W1202 19:15:27.125783   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:15:27.125788   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:15:27.125851   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:15:27.150224   54807 cri.go:89] found id: ""
	I1202 19:15:27.150237   54807 logs.go:282] 0 containers: []
	W1202 19:15:27.150244   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:15:27.150252   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:15:27.150262   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:15:27.178321   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:15:27.178338   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:15:27.233465   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:15:27.233484   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:15:27.244423   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:15:27.244437   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:15:27.311220   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:15:27.303476   12710 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:27.303903   12710 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:27.305730   12710 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:27.306227   12710 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:27.307716   12710 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:15:27.303476   12710 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:27.303903   12710 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:27.305730   12710 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:27.306227   12710 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:27.307716   12710 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:15:27.311235   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:15:27.311246   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:15:29.874091   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:15:29.884341   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:15:29.884402   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:15:29.909943   54807 cri.go:89] found id: ""
	I1202 19:15:29.909962   54807 logs.go:282] 0 containers: []
	W1202 19:15:29.909970   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:15:29.909975   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:15:29.910035   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:15:29.947534   54807 cri.go:89] found id: ""
	I1202 19:15:29.947547   54807 logs.go:282] 0 containers: []
	W1202 19:15:29.947554   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:15:29.947559   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:15:29.947617   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:15:29.989319   54807 cri.go:89] found id: ""
	I1202 19:15:29.989335   54807 logs.go:282] 0 containers: []
	W1202 19:15:29.989343   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:15:29.989349   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:15:29.989414   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:15:30.038828   54807 cri.go:89] found id: ""
	I1202 19:15:30.038842   54807 logs.go:282] 0 containers: []
	W1202 19:15:30.038850   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:15:30.038856   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:15:30.038932   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:15:30.067416   54807 cri.go:89] found id: ""
	I1202 19:15:30.067432   54807 logs.go:282] 0 containers: []
	W1202 19:15:30.067440   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:15:30.067446   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:15:30.067509   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:15:30.094866   54807 cri.go:89] found id: ""
	I1202 19:15:30.094881   54807 logs.go:282] 0 containers: []
	W1202 19:15:30.094888   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:15:30.094896   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:15:30.094958   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:15:30.120930   54807 cri.go:89] found id: ""
	I1202 19:15:30.120959   54807 logs.go:282] 0 containers: []
	W1202 19:15:30.120968   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:15:30.120977   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:15:30.120988   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:15:30.177165   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:15:30.177186   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:15:30.188251   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:15:30.188267   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:15:30.255176   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:15:30.247015   12803 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:30.247757   12803 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:30.249382   12803 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:30.250104   12803 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:30.251678   12803 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:15:30.247015   12803 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:30.247757   12803 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:30.249382   12803 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:30.250104   12803 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:30.251678   12803 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:15:30.255194   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:15:30.255205   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:15:30.323165   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:15:30.323189   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:15:32.854201   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:15:32.864404   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:15:32.864467   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:15:32.890146   54807 cri.go:89] found id: ""
	I1202 19:15:32.890160   54807 logs.go:282] 0 containers: []
	W1202 19:15:32.890166   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:15:32.890172   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:15:32.890239   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:15:32.915189   54807 cri.go:89] found id: ""
	I1202 19:15:32.915202   54807 logs.go:282] 0 containers: []
	W1202 19:15:32.915210   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:15:32.915215   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:15:32.915286   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:15:32.952949   54807 cri.go:89] found id: ""
	I1202 19:15:32.952962   54807 logs.go:282] 0 containers: []
	W1202 19:15:32.952969   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:15:32.952975   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:15:32.953031   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:15:32.986345   54807 cri.go:89] found id: ""
	I1202 19:15:32.986359   54807 logs.go:282] 0 containers: []
	W1202 19:15:32.986366   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:15:32.986371   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:15:32.986435   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:15:33.010880   54807 cri.go:89] found id: ""
	I1202 19:15:33.010894   54807 logs.go:282] 0 containers: []
	W1202 19:15:33.010902   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:15:33.010908   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:15:33.010966   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:15:33.039327   54807 cri.go:89] found id: ""
	I1202 19:15:33.039341   54807 logs.go:282] 0 containers: []
	W1202 19:15:33.039348   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:15:33.039354   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:15:33.039412   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:15:33.064437   54807 cri.go:89] found id: ""
	I1202 19:15:33.064463   54807 logs.go:282] 0 containers: []
	W1202 19:15:33.064470   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:15:33.064478   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:15:33.064488   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:15:33.120755   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:15:33.120773   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:15:33.132552   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:15:33.132575   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:15:33.199378   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:15:33.191204   12907 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:33.191873   12907 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:33.193517   12907 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:33.194121   12907 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:33.195812   12907 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:15:33.191204   12907 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:33.191873   12907 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:33.193517   12907 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:33.194121   12907 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:33.195812   12907 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:15:33.199389   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:15:33.199401   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:15:33.266899   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:15:33.266918   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:15:35.796024   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:15:35.807086   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:15:35.807146   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:15:35.839365   54807 cri.go:89] found id: ""
	I1202 19:15:35.839378   54807 logs.go:282] 0 containers: []
	W1202 19:15:35.839394   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:15:35.839400   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:15:35.839469   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:15:35.872371   54807 cri.go:89] found id: ""
	I1202 19:15:35.872385   54807 logs.go:282] 0 containers: []
	W1202 19:15:35.872393   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:15:35.872398   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:15:35.872467   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:15:35.901242   54807 cri.go:89] found id: ""
	I1202 19:15:35.901255   54807 logs.go:282] 0 containers: []
	W1202 19:15:35.901262   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:15:35.901268   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:15:35.901326   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:15:35.936195   54807 cri.go:89] found id: ""
	I1202 19:15:35.936209   54807 logs.go:282] 0 containers: []
	W1202 19:15:35.936215   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:15:35.936221   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:15:35.936282   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:15:35.965129   54807 cri.go:89] found id: ""
	I1202 19:15:35.965145   54807 logs.go:282] 0 containers: []
	W1202 19:15:35.965153   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:15:35.965159   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:15:35.966675   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:15:35.998286   54807 cri.go:89] found id: ""
	I1202 19:15:35.998299   54807 logs.go:282] 0 containers: []
	W1202 19:15:35.998306   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:15:35.998311   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:15:35.998371   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:15:36.024787   54807 cri.go:89] found id: ""
	I1202 19:15:36.024800   54807 logs.go:282] 0 containers: []
	W1202 19:15:36.024812   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:15:36.024820   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:15:36.024829   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:15:36.081130   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:15:36.081146   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:15:36.092692   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:15:36.092714   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:15:36.154814   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:15:36.146918   13010 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:36.147704   13010 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:36.149430   13010 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:36.149884   13010 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:36.151372   13010 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:15:36.146918   13010 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:36.147704   13010 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:36.149430   13010 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:36.149884   13010 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:36.151372   13010 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:15:36.154824   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:15:36.154837   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:15:36.218034   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:15:36.218052   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:15:38.748085   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:15:38.758270   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:15:38.758328   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:15:38.786304   54807 cri.go:89] found id: ""
	I1202 19:15:38.786317   54807 logs.go:282] 0 containers: []
	W1202 19:15:38.786325   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:15:38.786330   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:15:38.786389   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:15:38.811113   54807 cri.go:89] found id: ""
	I1202 19:15:38.811126   54807 logs.go:282] 0 containers: []
	W1202 19:15:38.811134   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:15:38.811139   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:15:38.811223   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:15:38.836191   54807 cri.go:89] found id: ""
	I1202 19:15:38.836207   54807 logs.go:282] 0 containers: []
	W1202 19:15:38.836214   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:15:38.836219   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:15:38.836278   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:15:38.860383   54807 cri.go:89] found id: ""
	I1202 19:15:38.860396   54807 logs.go:282] 0 containers: []
	W1202 19:15:38.860403   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:15:38.860410   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:15:38.860469   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:15:38.887750   54807 cri.go:89] found id: ""
	I1202 19:15:38.887764   54807 logs.go:282] 0 containers: []
	W1202 19:15:38.887770   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:15:38.887775   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:15:38.887834   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:15:38.914103   54807 cri.go:89] found id: ""
	I1202 19:15:38.914116   54807 logs.go:282] 0 containers: []
	W1202 19:15:38.914123   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:15:38.914128   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:15:38.914184   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:15:38.950405   54807 cri.go:89] found id: ""
	I1202 19:15:38.950418   54807 logs.go:282] 0 containers: []
	W1202 19:15:38.950425   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:15:38.950433   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:15:38.950442   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:15:39.016206   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:15:39.016225   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:15:39.026699   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:15:39.026714   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:15:39.090183   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:15:39.082441   13115 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:39.083071   13115 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:39.084892   13115 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:39.085258   13115 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:39.086741   13115 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:15:39.082441   13115 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:39.083071   13115 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:39.084892   13115 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:39.085258   13115 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:39.086741   13115 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:15:39.090195   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:15:39.090206   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:15:39.151533   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:15:39.151551   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:15:41.681058   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:15:41.691353   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:15:41.691417   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:15:41.716684   54807 cri.go:89] found id: ""
	I1202 19:15:41.716697   54807 logs.go:282] 0 containers: []
	W1202 19:15:41.716704   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:15:41.716710   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:15:41.716768   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:15:41.742096   54807 cri.go:89] found id: ""
	I1202 19:15:41.742110   54807 logs.go:282] 0 containers: []
	W1202 19:15:41.742117   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:15:41.742122   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:15:41.742182   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:15:41.766652   54807 cri.go:89] found id: ""
	I1202 19:15:41.766665   54807 logs.go:282] 0 containers: []
	W1202 19:15:41.766672   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:15:41.766678   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:15:41.766741   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:15:41.791517   54807 cri.go:89] found id: ""
	I1202 19:15:41.791531   54807 logs.go:282] 0 containers: []
	W1202 19:15:41.791538   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:15:41.791544   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:15:41.791600   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:15:41.817700   54807 cri.go:89] found id: ""
	I1202 19:15:41.817713   54807 logs.go:282] 0 containers: []
	W1202 19:15:41.817720   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:15:41.817725   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:15:41.817786   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:15:41.846078   54807 cri.go:89] found id: ""
	I1202 19:15:41.846092   54807 logs.go:282] 0 containers: []
	W1202 19:15:41.846099   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:15:41.846104   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:15:41.846161   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:15:41.874235   54807 cri.go:89] found id: ""
	I1202 19:15:41.874249   54807 logs.go:282] 0 containers: []
	W1202 19:15:41.874258   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:15:41.874268   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:15:41.874278   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:15:41.942286   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:15:41.942307   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:15:41.989723   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:15:41.989740   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:15:42.047707   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:15:42.047728   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:15:42.061053   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:15:42.061073   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:15:42.138885   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:15:42.129369   13235 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:42.130125   13235 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:42.131195   13235 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:42.132151   13235 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:42.133038   13235 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:15:42.129369   13235 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:42.130125   13235 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:42.131195   13235 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:42.132151   13235 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:42.133038   13235 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:15:44.639103   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:15:44.648984   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:15:44.649044   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:15:44.673076   54807 cri.go:89] found id: ""
	I1202 19:15:44.673091   54807 logs.go:282] 0 containers: []
	W1202 19:15:44.673098   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:15:44.673105   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:15:44.673162   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:15:44.696488   54807 cri.go:89] found id: ""
	I1202 19:15:44.696501   54807 logs.go:282] 0 containers: []
	W1202 19:15:44.696507   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:15:44.696512   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:15:44.696568   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:15:44.722164   54807 cri.go:89] found id: ""
	I1202 19:15:44.722177   54807 logs.go:282] 0 containers: []
	W1202 19:15:44.722184   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:15:44.722190   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:15:44.722254   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:15:44.745410   54807 cri.go:89] found id: ""
	I1202 19:15:44.745424   54807 logs.go:282] 0 containers: []
	W1202 19:15:44.745431   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:15:44.745437   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:15:44.745494   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:15:44.769317   54807 cri.go:89] found id: ""
	I1202 19:15:44.769330   54807 logs.go:282] 0 containers: []
	W1202 19:15:44.769337   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:15:44.769342   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:15:44.769404   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:15:44.794282   54807 cri.go:89] found id: ""
	I1202 19:15:44.794295   54807 logs.go:282] 0 containers: []
	W1202 19:15:44.794302   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:15:44.794308   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:15:44.794369   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:15:44.818676   54807 cri.go:89] found id: ""
	I1202 19:15:44.818689   54807 logs.go:282] 0 containers: []
	W1202 19:15:44.818696   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:15:44.818703   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:15:44.818734   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:15:44.829491   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:15:44.829506   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:15:44.892401   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:15:44.884881   13319 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:44.885617   13319 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:44.887285   13319 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:44.887590   13319 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:44.889113   13319 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:15:44.884881   13319 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:44.885617   13319 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:44.887285   13319 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:44.887590   13319 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:44.889113   13319 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:15:44.892427   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:15:44.892438   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:15:44.961436   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:15:44.961457   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:15:45.004301   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:15:45.004340   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:15:47.597359   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:15:47.607380   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:15:47.607436   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:15:47.632361   54807 cri.go:89] found id: ""
	I1202 19:15:47.632375   54807 logs.go:282] 0 containers: []
	W1202 19:15:47.632382   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:15:47.632387   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:15:47.632443   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:15:47.657478   54807 cri.go:89] found id: ""
	I1202 19:15:47.657491   54807 logs.go:282] 0 containers: []
	W1202 19:15:47.657498   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:15:47.657504   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:15:47.657565   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:15:47.681973   54807 cri.go:89] found id: ""
	I1202 19:15:47.681987   54807 logs.go:282] 0 containers: []
	W1202 19:15:47.681994   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:15:47.681999   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:15:47.682054   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:15:47.705968   54807 cri.go:89] found id: ""
	I1202 19:15:47.705982   54807 logs.go:282] 0 containers: []
	W1202 19:15:47.705988   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:15:47.705994   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:15:47.706051   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:15:47.730910   54807 cri.go:89] found id: ""
	I1202 19:15:47.730923   54807 logs.go:282] 0 containers: []
	W1202 19:15:47.730930   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:15:47.730935   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:15:47.730992   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:15:47.757739   54807 cri.go:89] found id: ""
	I1202 19:15:47.757752   54807 logs.go:282] 0 containers: []
	W1202 19:15:47.757759   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:15:47.757764   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:15:47.757820   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:15:47.782566   54807 cri.go:89] found id: ""
	I1202 19:15:47.782579   54807 logs.go:282] 0 containers: []
	W1202 19:15:47.782586   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:15:47.782594   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:15:47.782605   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:15:47.845974   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:15:47.837753   13419 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:47.838402   13419 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:47.840192   13419 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:47.840777   13419 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:47.842546   13419 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:15:47.837753   13419 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:47.838402   13419 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:47.840192   13419 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:47.840777   13419 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:47.842546   13419 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:15:47.845983   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:15:47.845994   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:15:47.913035   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:15:47.913054   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:15:47.952076   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:15:47.952091   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:15:48.023577   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:15:48.023596   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:15:50.534902   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:15:50.544843   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:15:50.544904   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:15:50.573435   54807 cri.go:89] found id: ""
	I1202 19:15:50.573449   54807 logs.go:282] 0 containers: []
	W1202 19:15:50.573456   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:15:50.573462   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:15:50.573524   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:15:50.598029   54807 cri.go:89] found id: ""
	I1202 19:15:50.598043   54807 logs.go:282] 0 containers: []
	W1202 19:15:50.598051   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:15:50.598056   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:15:50.598115   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:15:50.623452   54807 cri.go:89] found id: ""
	I1202 19:15:50.623465   54807 logs.go:282] 0 containers: []
	W1202 19:15:50.623472   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:15:50.623478   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:15:50.623536   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:15:50.648357   54807 cri.go:89] found id: ""
	I1202 19:15:50.648371   54807 logs.go:282] 0 containers: []
	W1202 19:15:50.648378   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:15:50.648383   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:15:50.648441   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:15:50.672042   54807 cri.go:89] found id: ""
	I1202 19:15:50.672056   54807 logs.go:282] 0 containers: []
	W1202 19:15:50.672063   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:15:50.672068   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:15:50.672125   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:15:50.697434   54807 cri.go:89] found id: ""
	I1202 19:15:50.697448   54807 logs.go:282] 0 containers: []
	W1202 19:15:50.697455   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:15:50.697461   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:15:50.697525   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:15:50.728291   54807 cri.go:89] found id: ""
	I1202 19:15:50.728305   54807 logs.go:282] 0 containers: []
	W1202 19:15:50.728312   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:15:50.728340   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:15:50.728351   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:15:50.790193   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:15:50.781764   13524 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:50.782494   13524 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:50.784122   13524 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:50.784535   13524 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:50.786186   13524 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:15:50.781764   13524 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:50.782494   13524 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:50.784122   13524 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:50.784535   13524 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:50.786186   13524 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:15:50.790203   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:15:50.790214   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:15:50.855933   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:15:50.855951   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:15:50.884682   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:15:50.884698   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:15:50.949404   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:15:50.949423   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:15:53.461440   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:15:53.471831   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:15:53.471906   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:15:53.496591   54807 cri.go:89] found id: ""
	I1202 19:15:53.496604   54807 logs.go:282] 0 containers: []
	W1202 19:15:53.496611   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:15:53.496617   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:15:53.496674   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:15:53.521087   54807 cri.go:89] found id: ""
	I1202 19:15:53.521103   54807 logs.go:282] 0 containers: []
	W1202 19:15:53.521111   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:15:53.521116   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:15:53.521174   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:15:53.545148   54807 cri.go:89] found id: ""
	I1202 19:15:53.545161   54807 logs.go:282] 0 containers: []
	W1202 19:15:53.545168   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:15:53.545173   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:15:53.545231   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:15:53.570884   54807 cri.go:89] found id: ""
	I1202 19:15:53.570898   54807 logs.go:282] 0 containers: []
	W1202 19:15:53.570904   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:15:53.570910   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:15:53.570972   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:15:53.597220   54807 cri.go:89] found id: ""
	I1202 19:15:53.597234   54807 logs.go:282] 0 containers: []
	W1202 19:15:53.597241   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:15:53.597247   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:15:53.597326   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:15:53.626817   54807 cri.go:89] found id: ""
	I1202 19:15:53.626830   54807 logs.go:282] 0 containers: []
	W1202 19:15:53.626837   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:15:53.626843   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:15:53.626901   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:15:53.656721   54807 cri.go:89] found id: ""
	I1202 19:15:53.656734   54807 logs.go:282] 0 containers: []
	W1202 19:15:53.656741   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:15:53.656750   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:15:53.656762   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:15:53.721841   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:15:53.712824   13626 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:53.713681   13626 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:53.715369   13626 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:53.715912   13626 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:53.717503   13626 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:15:53.712824   13626 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:53.713681   13626 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:53.715369   13626 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:53.715912   13626 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:53.717503   13626 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:15:53.721850   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:15:53.721862   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:15:53.785783   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:15:53.785801   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:15:53.815658   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:15:53.815673   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:15:53.873221   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:15:53.873238   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:15:56.384447   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:15:56.394843   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:15:56.394909   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:15:56.425129   54807 cri.go:89] found id: ""
	I1202 19:15:56.425142   54807 logs.go:282] 0 containers: []
	W1202 19:15:56.425149   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:15:56.425154   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:15:56.425212   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:15:56.451236   54807 cri.go:89] found id: ""
	I1202 19:15:56.451250   54807 logs.go:282] 0 containers: []
	W1202 19:15:56.451257   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:15:56.451263   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:15:56.451327   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:15:56.476585   54807 cri.go:89] found id: ""
	I1202 19:15:56.476599   54807 logs.go:282] 0 containers: []
	W1202 19:15:56.476606   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:15:56.476611   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:15:56.476669   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:15:56.501814   54807 cri.go:89] found id: ""
	I1202 19:15:56.501828   54807 logs.go:282] 0 containers: []
	W1202 19:15:56.501834   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:15:56.501840   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:15:56.501900   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:15:56.530866   54807 cri.go:89] found id: ""
	I1202 19:15:56.530879   54807 logs.go:282] 0 containers: []
	W1202 19:15:56.530886   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:15:56.530891   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:15:56.530959   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:15:56.555014   54807 cri.go:89] found id: ""
	I1202 19:15:56.555029   54807 logs.go:282] 0 containers: []
	W1202 19:15:56.555036   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:15:56.555042   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:15:56.555102   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:15:56.582644   54807 cri.go:89] found id: ""
	I1202 19:15:56.582657   54807 logs.go:282] 0 containers: []
	W1202 19:15:56.582664   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:15:56.582672   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:15:56.582684   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:15:56.637937   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:15:56.637955   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:15:56.648656   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:15:56.648672   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:15:56.716929   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:15:56.708385   13741 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:56.708981   13741 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:56.710802   13741 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:56.711377   13741 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:56.712990   13741 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:15:56.708385   13741 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:56.708981   13741 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:56.710802   13741 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:56.711377   13741 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:56.712990   13741 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:15:56.716939   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:15:56.716950   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:15:56.783854   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:15:56.783880   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
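The "container status" command in the line above uses a shell fallback idiom: `which crictl || echo crictl` resolves the binary's path when it exists and otherwise falls back to the bare name, so the outer command still has something to exec (and can itself fall back to `docker ps -a`). A minimal self-contained sketch of that idiom, demonstrated with `ls` (always on PATH) and a hypothetical missing binary name:

```shell
# `which` prints the resolved path and exits 0 when the binary exists;
# otherwise the `|| echo` branch substitutes the bare name.
resolved=$(which ls || echo ls)                                   # exists: resolved path
missing=$(which no-such-binary-xyz || echo no-such-binary-xyz)    # hypothetical: falls back to the name
echo "resolved=$resolved"
echo "missing=$missing"
```

The same pattern generalizes to any "prefer the installed tool, degrade gracefully" probe.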
	I1202 19:15:59.312498   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:15:59.322671   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:15:59.322730   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:15:59.346425   54807 cri.go:89] found id: ""
	I1202 19:15:59.346439   54807 logs.go:282] 0 containers: []
	W1202 19:15:59.346446   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:15:59.346452   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:15:59.346515   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:15:59.371199   54807 cri.go:89] found id: ""
	I1202 19:15:59.371212   54807 logs.go:282] 0 containers: []
	W1202 19:15:59.371219   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:15:59.371224   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:15:59.371286   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:15:59.398444   54807 cri.go:89] found id: ""
	I1202 19:15:59.398458   54807 logs.go:282] 0 containers: []
	W1202 19:15:59.398465   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:15:59.398470   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:15:59.398528   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:15:59.423109   54807 cri.go:89] found id: ""
	I1202 19:15:59.423122   54807 logs.go:282] 0 containers: []
	W1202 19:15:59.423129   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:15:59.423135   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:15:59.423193   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:15:59.448440   54807 cri.go:89] found id: ""
	I1202 19:15:59.448454   54807 logs.go:282] 0 containers: []
	W1202 19:15:59.448461   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:15:59.448469   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:15:59.448539   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:15:59.472288   54807 cri.go:89] found id: ""
	I1202 19:15:59.472302   54807 logs.go:282] 0 containers: []
	W1202 19:15:59.472309   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:15:59.472315   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:15:59.472396   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:15:59.501959   54807 cri.go:89] found id: ""
	I1202 19:15:59.501973   54807 logs.go:282] 0 containers: []
	W1202 19:15:59.501980   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:15:59.501987   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:15:59.501999   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:15:59.562783   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:15:59.555099   13842 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:59.555718   13842 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:59.557259   13842 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:59.557814   13842 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:59.559465   13842 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:15:59.555099   13842 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:59.555718   13842 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:59.557259   13842 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:59.557814   13842 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:59.559465   13842 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:15:59.562800   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:15:59.562811   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:15:59.626612   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:15:59.626631   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:15:59.655068   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:15:59.655083   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:15:59.713332   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:15:59.713350   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:16:02.224451   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:16:02.234704   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:16:02.234765   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:16:02.260861   54807 cri.go:89] found id: ""
	I1202 19:16:02.260875   54807 logs.go:282] 0 containers: []
	W1202 19:16:02.260882   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:16:02.260888   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:16:02.260951   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:16:02.286334   54807 cri.go:89] found id: ""
	I1202 19:16:02.286354   54807 logs.go:282] 0 containers: []
	W1202 19:16:02.286362   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:16:02.286367   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:16:02.286426   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:16:02.310961   54807 cri.go:89] found id: ""
	I1202 19:16:02.310975   54807 logs.go:282] 0 containers: []
	W1202 19:16:02.310982   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:16:02.310988   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:16:02.311050   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:16:02.339645   54807 cri.go:89] found id: ""
	I1202 19:16:02.339658   54807 logs.go:282] 0 containers: []
	W1202 19:16:02.339665   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:16:02.339670   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:16:02.339727   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:16:02.364456   54807 cri.go:89] found id: ""
	I1202 19:16:02.364471   54807 logs.go:282] 0 containers: []
	W1202 19:16:02.364478   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:16:02.364484   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:16:02.364547   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:16:02.394258   54807 cri.go:89] found id: ""
	I1202 19:16:02.394272   54807 logs.go:282] 0 containers: []
	W1202 19:16:02.394278   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:16:02.394284   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:16:02.394342   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:16:02.418723   54807 cri.go:89] found id: ""
	I1202 19:16:02.418737   54807 logs.go:282] 0 containers: []
	W1202 19:16:02.418744   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:16:02.418752   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:16:02.418762   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:16:02.482679   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:16:02.474517   13947 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:02.475378   13947 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:02.476933   13947 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:02.477486   13947 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:02.479002   13947 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:16:02.474517   13947 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:02.475378   13947 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:02.476933   13947 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:02.477486   13947 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:02.479002   13947 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
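The repeated `dial tcp [::1]:8441: connect: connection refused` in the stderr blocks above means exactly one thing: no process is listening on the apiserver port, consistent with the empty `crictl` listings for kube-apiserver. A minimal sketch of that check, assuming `ss` from iproute2 is available (`listeners` simply stays empty when `ss` is missing or the port is closed):

```shell
# Check whether anything is listening on the apiserver port; an empty
# result is the condition behind kubectl's "connection refused".
port=8441
listeners=$(ss -ltn 2>/dev/null | grep ":$port " || true)
if [ -z "$listeners" ]; then
  msg="port $port: no listener (kubectl gets connection refused)"
else
  msg="port $port: listener present"
fi
echo "$msg"
```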
	I1202 19:16:02.482690   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:16:02.482700   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:16:02.548276   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:16:02.548295   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:16:02.578369   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:16:02.578386   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:16:02.636563   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:16:02.636581   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:16:05.147857   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:16:05.158273   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:16:05.158332   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:16:05.198133   54807 cri.go:89] found id: ""
	I1202 19:16:05.198149   54807 logs.go:282] 0 containers: []
	W1202 19:16:05.198161   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:16:05.198167   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:16:05.198230   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:16:05.229481   54807 cri.go:89] found id: ""
	I1202 19:16:05.229494   54807 logs.go:282] 0 containers: []
	W1202 19:16:05.229508   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:16:05.229513   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:16:05.229573   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:16:05.255940   54807 cri.go:89] found id: ""
	I1202 19:16:05.255954   54807 logs.go:282] 0 containers: []
	W1202 19:16:05.255961   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:16:05.255967   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:16:05.256027   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:16:05.281978   54807 cri.go:89] found id: ""
	I1202 19:16:05.281991   54807 logs.go:282] 0 containers: []
	W1202 19:16:05.281998   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:16:05.282004   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:16:05.282063   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:16:05.310511   54807 cri.go:89] found id: ""
	I1202 19:16:05.310525   54807 logs.go:282] 0 containers: []
	W1202 19:16:05.310533   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:16:05.310539   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:16:05.310605   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:16:05.340114   54807 cri.go:89] found id: ""
	I1202 19:16:05.340127   54807 logs.go:282] 0 containers: []
	W1202 19:16:05.340135   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:16:05.340140   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:16:05.340198   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:16:05.366243   54807 cri.go:89] found id: ""
	I1202 19:16:05.366256   54807 logs.go:282] 0 containers: []
	W1202 19:16:05.366263   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:16:05.366271   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:16:05.366283   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:16:05.393993   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:16:05.394009   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:16:05.450279   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:16:05.450299   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:16:05.461585   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:16:05.461602   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:16:05.528601   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:16:05.520245   14070 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:05.521019   14070 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:05.522739   14070 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:05.523260   14070 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:05.524914   14070 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:16:05.520245   14070 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:05.521019   14070 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:05.522739   14070 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:05.523260   14070 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:05.524914   14070 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:16:05.528610   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:16:05.528621   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:16:08.097252   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:16:08.107731   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:16:08.107792   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:16:08.134215   54807 cri.go:89] found id: ""
	I1202 19:16:08.134240   54807 logs.go:282] 0 containers: []
	W1202 19:16:08.134248   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:16:08.134255   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:16:08.134327   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:16:08.160174   54807 cri.go:89] found id: ""
	I1202 19:16:08.160188   54807 logs.go:282] 0 containers: []
	W1202 19:16:08.160195   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:16:08.160200   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:16:08.160259   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:16:08.188835   54807 cri.go:89] found id: ""
	I1202 19:16:08.188849   54807 logs.go:282] 0 containers: []
	W1202 19:16:08.188856   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:16:08.188871   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:16:08.188930   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:16:08.222672   54807 cri.go:89] found id: ""
	I1202 19:16:08.222686   54807 logs.go:282] 0 containers: []
	W1202 19:16:08.222703   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:16:08.222710   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:16:08.222774   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:16:08.252685   54807 cri.go:89] found id: ""
	I1202 19:16:08.252699   54807 logs.go:282] 0 containers: []
	W1202 19:16:08.252705   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:16:08.252711   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:16:08.252767   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:16:08.281659   54807 cri.go:89] found id: ""
	I1202 19:16:08.281672   54807 logs.go:282] 0 containers: []
	W1202 19:16:08.281679   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:16:08.281685   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:16:08.281757   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:16:08.306909   54807 cri.go:89] found id: ""
	I1202 19:16:08.306922   54807 logs.go:282] 0 containers: []
	W1202 19:16:08.306929   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:16:08.306936   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:16:08.306947   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:16:08.363919   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:16:08.363938   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:16:08.375138   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:16:08.375154   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:16:08.443392   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:16:08.435786   14165 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:08.436517   14165 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:08.438269   14165 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:08.438769   14165 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:08.439921   14165 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:16:08.435786   14165 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:08.436517   14165 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:08.438269   14165 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:08.438769   14165 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:08.439921   14165 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:16:08.443414   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:16:08.443428   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:16:08.507474   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:16:08.507492   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
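Each block above is one iteration of the same wait loop: minikube re-runs `sudo pgrep -xnf kube-apiserver.*minikube.*` on a roughly three-second cadence and, on every miss, re-lists the CRI containers per component and regathers kubelet/dmesg/containerd logs. A self-contained sketch of that loop, with a hypothetical `probe` stub standing in for the pgrep check (it always fails, matching this log; the real loop's ~3 s sleep between attempts is omitted here):

```shell
# Stand-in for `sudo pgrep -xnf kube-apiserver.*minikube.*`; always fails.
probe() { return 1; }

wait_for_apiserver() {
  max=5 n=0
  while [ "$n" -lt "$max" ]; do
    if probe; then
      echo "kube-apiserver up after $n retries"
      return 0
    fi
    n=$((n + 1))
    echo "attempt $n: kube-apiserver not found; regathering logs"
    # Real loop: crictl ps -a per component, journalctl -u kubelet/containerd,
    # dmesg, and `kubectl describe nodes` — exactly the blocks in this log.
  done
  echo "gave up after $max attempts"
  return 1
}

result=$(wait_for_apiserver) || true
echo "$result"
```

In the failing run above the loop never sees a live apiserver, which is why the same gather-logs block repeats until the test times out.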
	I1202 19:16:11.037665   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:16:11.050056   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:16:11.050130   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:16:11.076993   54807 cri.go:89] found id: ""
	I1202 19:16:11.077008   54807 logs.go:282] 0 containers: []
	W1202 19:16:11.077015   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:16:11.077021   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:16:11.077088   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:16:11.104370   54807 cri.go:89] found id: ""
	I1202 19:16:11.104384   54807 logs.go:282] 0 containers: []
	W1202 19:16:11.104393   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:16:11.104399   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:16:11.104463   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:16:11.132145   54807 cri.go:89] found id: ""
	I1202 19:16:11.132160   54807 logs.go:282] 0 containers: []
	W1202 19:16:11.132168   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:16:11.132174   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:16:11.132235   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:16:11.158847   54807 cri.go:89] found id: ""
	I1202 19:16:11.158861   54807 logs.go:282] 0 containers: []
	W1202 19:16:11.158868   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:16:11.158874   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:16:11.158934   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:16:11.198715   54807 cri.go:89] found id: ""
	I1202 19:16:11.198729   54807 logs.go:282] 0 containers: []
	W1202 19:16:11.198736   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:16:11.198741   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:16:11.198804   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:16:11.230867   54807 cri.go:89] found id: ""
	I1202 19:16:11.230886   54807 logs.go:282] 0 containers: []
	W1202 19:16:11.230893   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:16:11.230899   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:16:11.230957   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:16:11.259807   54807 cri.go:89] found id: ""
	I1202 19:16:11.259821   54807 logs.go:282] 0 containers: []
	W1202 19:16:11.259828   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:16:11.259836   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:16:11.259846   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:16:11.287151   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:16:11.287167   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:16:11.344009   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:16:11.344032   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:16:11.354412   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:16:11.354433   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:16:11.420896   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:16:11.412603   14280 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:11.413632   14280 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:11.415146   14280 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:11.415437   14280 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:11.416861   14280 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:16:11.412603   14280 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:11.413632   14280 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:11.415146   14280 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:11.415437   14280 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:11.416861   14280 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:16:11.420906   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:16:11.420917   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:16:13.984421   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:16:13.995238   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:16:13.995302   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:16:14.021325   54807 cri.go:89] found id: ""
	I1202 19:16:14.021338   54807 logs.go:282] 0 containers: []
	W1202 19:16:14.021345   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:16:14.021350   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:16:14.021407   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:16:14.047264   54807 cri.go:89] found id: ""
	I1202 19:16:14.047278   54807 logs.go:282] 0 containers: []
	W1202 19:16:14.047285   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:16:14.047291   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:16:14.047355   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:16:14.071231   54807 cri.go:89] found id: ""
	I1202 19:16:14.071245   54807 logs.go:282] 0 containers: []
	W1202 19:16:14.071252   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:16:14.071257   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:16:14.071315   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:16:14.096289   54807 cri.go:89] found id: ""
	I1202 19:16:14.096302   54807 logs.go:282] 0 containers: []
	W1202 19:16:14.096309   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:16:14.096315   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:16:14.096397   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:16:14.122522   54807 cri.go:89] found id: ""
	I1202 19:16:14.122535   54807 logs.go:282] 0 containers: []
	W1202 19:16:14.122542   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:16:14.122548   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:16:14.122608   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:16:14.151408   54807 cri.go:89] found id: ""
	I1202 19:16:14.151422   54807 logs.go:282] 0 containers: []
	W1202 19:16:14.151429   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:16:14.151435   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:16:14.151496   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:16:14.182327   54807 cri.go:89] found id: ""
	I1202 19:16:14.182340   54807 logs.go:282] 0 containers: []
	W1202 19:16:14.182347   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:16:14.182355   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:16:14.182365   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:16:14.246777   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:16:14.246796   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:16:14.262093   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:16:14.262108   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:16:14.326058   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:16:14.317581   14371 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:14.318458   14371 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:14.320176   14371 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:14.320802   14371 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:14.322292   14371 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:16:14.317581   14371 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:14.318458   14371 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:14.320176   14371 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:14.320802   14371 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:14.322292   14371 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:16:14.326068   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:16:14.326080   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:16:14.388559   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:16:14.388578   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:16:16.920108   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:16:16.930319   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:16:16.930382   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:16:16.955799   54807 cri.go:89] found id: ""
	I1202 19:16:16.955813   54807 logs.go:282] 0 containers: []
	W1202 19:16:16.955820   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:16:16.955825   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:16:16.955882   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:16:16.982139   54807 cri.go:89] found id: ""
	I1202 19:16:16.982153   54807 logs.go:282] 0 containers: []
	W1202 19:16:16.982160   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:16:16.982165   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:16:16.982223   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:16:17.007837   54807 cri.go:89] found id: ""
	I1202 19:16:17.007851   54807 logs.go:282] 0 containers: []
	W1202 19:16:17.007857   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:16:17.007863   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:16:17.007933   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:16:17.034216   54807 cri.go:89] found id: ""
	I1202 19:16:17.034229   54807 logs.go:282] 0 containers: []
	W1202 19:16:17.034236   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:16:17.034241   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:16:17.034298   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:16:17.063913   54807 cri.go:89] found id: ""
	I1202 19:16:17.063927   54807 logs.go:282] 0 containers: []
	W1202 19:16:17.063934   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:16:17.063939   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:16:17.063997   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:16:17.088826   54807 cri.go:89] found id: ""
	I1202 19:16:17.088840   54807 logs.go:282] 0 containers: []
	W1202 19:16:17.088847   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:16:17.088853   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:16:17.088913   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:16:17.114356   54807 cri.go:89] found id: ""
	I1202 19:16:17.114370   54807 logs.go:282] 0 containers: []
	W1202 19:16:17.114376   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:16:17.114384   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:16:17.114394   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:16:17.171571   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:16:17.171591   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:16:17.192662   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:16:17.192677   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:16:17.265860   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:16:17.257716   14475 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:17.258547   14475 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:17.260144   14475 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:17.260824   14475 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:17.262474   14475 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:16:17.257716   14475 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:17.258547   14475 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:17.260144   14475 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:17.260824   14475 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:17.262474   14475 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:16:17.265870   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:16:17.265883   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:16:17.329636   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:16:17.329654   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:16:19.857139   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:16:19.867414   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:16:19.867471   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:16:19.891736   54807 cri.go:89] found id: ""
	I1202 19:16:19.891750   54807 logs.go:282] 0 containers: []
	W1202 19:16:19.891757   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:16:19.891762   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:16:19.891819   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:16:19.916840   54807 cri.go:89] found id: ""
	I1202 19:16:19.916854   54807 logs.go:282] 0 containers: []
	W1202 19:16:19.916861   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:16:19.916881   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:16:19.916938   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:16:19.941623   54807 cri.go:89] found id: ""
	I1202 19:16:19.941636   54807 logs.go:282] 0 containers: []
	W1202 19:16:19.941643   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:16:19.941649   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:16:19.941706   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:16:19.973037   54807 cri.go:89] found id: ""
	I1202 19:16:19.973051   54807 logs.go:282] 0 containers: []
	W1202 19:16:19.973059   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:16:19.973065   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:16:19.973134   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:16:20.000748   54807 cri.go:89] found id: ""
	I1202 19:16:20.000765   54807 logs.go:282] 0 containers: []
	W1202 19:16:20.000773   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:16:20.000780   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:16:20.000851   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:16:20.025854   54807 cri.go:89] found id: ""
	I1202 19:16:20.025868   54807 logs.go:282] 0 containers: []
	W1202 19:16:20.025875   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:16:20.025881   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:16:20.025940   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:16:20.052281   54807 cri.go:89] found id: ""
	I1202 19:16:20.052296   54807 logs.go:282] 0 containers: []
	W1202 19:16:20.052304   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:16:20.052312   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:16:20.052346   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:16:20.120511   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:16:20.111945   14568 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:20.112719   14568 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:20.114519   14568 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:20.115208   14568 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:20.116979   14568 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:16:20.111945   14568 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:20.112719   14568 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:20.114519   14568 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:20.115208   14568 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:20.116979   14568 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:16:20.120542   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:16:20.120557   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:16:20.192068   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:16:20.192088   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:16:20.232059   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:16:20.232074   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:16:20.287505   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:16:20.287527   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:16:22.798885   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:16:22.808880   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:16:22.808947   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:16:22.838711   54807 cri.go:89] found id: ""
	I1202 19:16:22.838736   54807 logs.go:282] 0 containers: []
	W1202 19:16:22.838744   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:16:22.838750   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:16:22.838815   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:16:22.866166   54807 cri.go:89] found id: ""
	I1202 19:16:22.866180   54807 logs.go:282] 0 containers: []
	W1202 19:16:22.866187   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:16:22.866192   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:16:22.866250   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:16:22.890456   54807 cri.go:89] found id: ""
	I1202 19:16:22.890470   54807 logs.go:282] 0 containers: []
	W1202 19:16:22.890484   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:16:22.890490   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:16:22.890554   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:16:22.915548   54807 cri.go:89] found id: ""
	I1202 19:16:22.915562   54807 logs.go:282] 0 containers: []
	W1202 19:16:22.915578   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:16:22.915585   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:16:22.915643   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:16:22.940011   54807 cri.go:89] found id: ""
	I1202 19:16:22.940025   54807 logs.go:282] 0 containers: []
	W1202 19:16:22.940032   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:16:22.940037   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:16:22.940093   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:16:22.965647   54807 cri.go:89] found id: ""
	I1202 19:16:22.965660   54807 logs.go:282] 0 containers: []
	W1202 19:16:22.965670   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:16:22.965677   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:16:22.965744   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:16:22.994566   54807 cri.go:89] found id: ""
	I1202 19:16:22.994580   54807 logs.go:282] 0 containers: []
	W1202 19:16:22.994587   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:16:22.994595   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:16:22.994611   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:16:23.050953   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:16:23.050973   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:16:23.061610   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:16:23.061624   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:16:23.127525   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:16:23.119520   14680 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:23.120161   14680 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:23.121888   14680 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:23.122530   14680 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:23.124179   14680 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1202 19:16:23.127534   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:16:23.127546   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:16:23.194603   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:16:23.194639   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:16:25.725656   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:16:25.735521   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:16:25.735580   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:16:25.760626   54807 cri.go:89] found id: ""
	I1202 19:16:25.760640   54807 logs.go:282] 0 containers: []
	W1202 19:16:25.760647   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:16:25.760652   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:16:25.760711   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:16:25.786443   54807 cri.go:89] found id: ""
	I1202 19:16:25.786457   54807 logs.go:282] 0 containers: []
	W1202 19:16:25.786464   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:16:25.786470   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:16:25.786529   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:16:25.813975   54807 cri.go:89] found id: ""
	I1202 19:16:25.813989   54807 logs.go:282] 0 containers: []
	W1202 19:16:25.813996   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:16:25.814001   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:16:25.814059   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:16:25.839899   54807 cri.go:89] found id: ""
	I1202 19:16:25.839912   54807 logs.go:282] 0 containers: []
	W1202 19:16:25.839920   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:16:25.839925   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:16:25.839983   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:16:25.869299   54807 cri.go:89] found id: ""
	I1202 19:16:25.869312   54807 logs.go:282] 0 containers: []
	W1202 19:16:25.869319   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:16:25.869325   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:16:25.869384   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:16:25.894364   54807 cri.go:89] found id: ""
	I1202 19:16:25.894379   54807 logs.go:282] 0 containers: []
	W1202 19:16:25.894385   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:16:25.894391   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:16:25.894448   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:16:25.919717   54807 cri.go:89] found id: ""
	I1202 19:16:25.919733   54807 logs.go:282] 0 containers: []
	W1202 19:16:25.919741   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:16:25.919748   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:16:25.919759   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:16:25.988177   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:16:25.979290   14781 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:25.979818   14781 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:25.981491   14781 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:25.982030   14781 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:25.983829   14781 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1202 19:16:25.988188   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:16:25.988198   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:16:26.052787   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:16:26.052806   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:16:26.081027   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:16:26.081042   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:16:26.138061   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:16:26.138079   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:16:28.650000   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:16:28.660481   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:16:28.660541   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:16:28.685594   54807 cri.go:89] found id: ""
	I1202 19:16:28.685608   54807 logs.go:282] 0 containers: []
	W1202 19:16:28.685616   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:16:28.685621   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:16:28.685679   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:16:28.710399   54807 cri.go:89] found id: ""
	I1202 19:16:28.710412   54807 logs.go:282] 0 containers: []
	W1202 19:16:28.710419   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:16:28.710425   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:16:28.710481   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:16:28.735520   54807 cri.go:89] found id: ""
	I1202 19:16:28.735533   54807 logs.go:282] 0 containers: []
	W1202 19:16:28.735546   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:16:28.735551   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:16:28.735607   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:16:28.762423   54807 cri.go:89] found id: ""
	I1202 19:16:28.762436   54807 logs.go:282] 0 containers: []
	W1202 19:16:28.762443   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:16:28.762449   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:16:28.762515   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:16:28.791746   54807 cri.go:89] found id: ""
	I1202 19:16:28.791760   54807 logs.go:282] 0 containers: []
	W1202 19:16:28.791767   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:16:28.791772   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:16:28.791831   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:16:28.818359   54807 cri.go:89] found id: ""
	I1202 19:16:28.818372   54807 logs.go:282] 0 containers: []
	W1202 19:16:28.818379   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:16:28.818386   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:16:28.818443   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:16:28.846465   54807 cri.go:89] found id: ""
	I1202 19:16:28.846479   54807 logs.go:282] 0 containers: []
	W1202 19:16:28.846486   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:16:28.846494   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:16:28.846503   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:16:28.903412   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:16:28.903430   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:16:28.914210   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:16:28.914267   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:16:28.978428   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:16:28.970543   14894 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:28.971044   14894 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:28.972803   14894 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:28.973286   14894 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:28.974839   14894 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1202 19:16:28.978439   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:16:28.978450   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:16:29.041343   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:16:29.041363   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:16:31.570595   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:16:31.583500   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:16:31.583573   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:16:31.611783   54807 cri.go:89] found id: ""
	I1202 19:16:31.611796   54807 logs.go:282] 0 containers: []
	W1202 19:16:31.611805   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:16:31.611811   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:16:31.611868   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:16:31.639061   54807 cri.go:89] found id: ""
	I1202 19:16:31.639074   54807 logs.go:282] 0 containers: []
	W1202 19:16:31.639081   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:16:31.639086   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:16:31.639152   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:16:31.664706   54807 cri.go:89] found id: ""
	I1202 19:16:31.664719   54807 logs.go:282] 0 containers: []
	W1202 19:16:31.664726   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:16:31.664732   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:16:31.664789   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:16:31.688725   54807 cri.go:89] found id: ""
	I1202 19:16:31.688739   54807 logs.go:282] 0 containers: []
	W1202 19:16:31.688746   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:16:31.688751   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:16:31.688807   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:16:31.713308   54807 cri.go:89] found id: ""
	I1202 19:16:31.713321   54807 logs.go:282] 0 containers: []
	W1202 19:16:31.713328   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:16:31.713333   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:16:31.713391   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:16:31.737960   54807 cri.go:89] found id: ""
	I1202 19:16:31.737973   54807 logs.go:282] 0 containers: []
	W1202 19:16:31.737980   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:16:31.737985   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:16:31.738041   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:16:31.766035   54807 cri.go:89] found id: ""
	I1202 19:16:31.766048   54807 logs.go:282] 0 containers: []
	W1202 19:16:31.766055   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:16:31.766063   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:16:31.766078   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:16:31.821307   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:16:31.821327   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:16:31.832103   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:16:31.832118   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:16:31.894804   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:16:31.886409   14997 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:31.887232   14997 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:31.888852   14997 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:31.889449   14997 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:31.891143   14997 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1202 19:16:31.894814   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:16:31.894824   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:16:31.958623   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:16:31.958641   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:16:34.494532   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:16:34.504804   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:16:34.504861   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:16:34.534339   54807 cri.go:89] found id: ""
	I1202 19:16:34.534359   54807 logs.go:282] 0 containers: []
	W1202 19:16:34.534366   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:16:34.534372   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:16:34.534430   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:16:34.559181   54807 cri.go:89] found id: ""
	I1202 19:16:34.559194   54807 logs.go:282] 0 containers: []
	W1202 19:16:34.559203   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:16:34.559208   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:16:34.559266   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:16:34.583120   54807 cri.go:89] found id: ""
	I1202 19:16:34.583133   54807 logs.go:282] 0 containers: []
	W1202 19:16:34.583139   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:16:34.583145   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:16:34.583245   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:16:34.608256   54807 cri.go:89] found id: ""
	I1202 19:16:34.608269   54807 logs.go:282] 0 containers: []
	W1202 19:16:34.608276   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:16:34.608282   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:16:34.608365   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:16:34.632733   54807 cri.go:89] found id: ""
	I1202 19:16:34.632747   54807 logs.go:282] 0 containers: []
	W1202 19:16:34.632754   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:16:34.632759   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:16:34.632821   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:16:34.663293   54807 cri.go:89] found id: ""
	I1202 19:16:34.663307   54807 logs.go:282] 0 containers: []
	W1202 19:16:34.663314   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:16:34.663320   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:16:34.663376   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:16:34.686842   54807 cri.go:89] found id: ""
	I1202 19:16:34.686856   54807 logs.go:282] 0 containers: []
	W1202 19:16:34.686863   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:16:34.686871   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:16:34.686881   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:16:34.697549   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:16:34.697564   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:16:34.764406   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:16:34.756417   15099 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:34.757141   15099 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:34.758783   15099 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:34.759285   15099 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:34.760837   15099 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1202 19:16:34.764416   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:16:34.764427   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:16:34.827201   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:16:34.827223   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:16:34.854552   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:16:34.854570   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:16:37.413003   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:16:37.423382   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:16:37.423441   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:16:37.459973   54807 cri.go:89] found id: ""
	I1202 19:16:37.459987   54807 logs.go:282] 0 containers: []
	W1202 19:16:37.459994   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:16:37.460000   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:16:37.460062   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:16:37.494488   54807 cri.go:89] found id: ""
	I1202 19:16:37.494503   54807 logs.go:282] 0 containers: []
	W1202 19:16:37.494510   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:16:37.494515   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:16:37.494584   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:16:37.519270   54807 cri.go:89] found id: ""
	I1202 19:16:37.519283   54807 logs.go:282] 0 containers: []
	W1202 19:16:37.519290   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:16:37.519295   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:16:37.519351   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:16:37.545987   54807 cri.go:89] found id: ""
	I1202 19:16:37.546001   54807 logs.go:282] 0 containers: []
	W1202 19:16:37.546008   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:16:37.546013   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:16:37.546069   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:16:37.574348   54807 cri.go:89] found id: ""
	I1202 19:16:37.574362   54807 logs.go:282] 0 containers: []
	W1202 19:16:37.574369   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:16:37.574375   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:16:37.574437   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:16:37.600075   54807 cri.go:89] found id: ""
	I1202 19:16:37.600089   54807 logs.go:282] 0 containers: []
	W1202 19:16:37.600096   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:16:37.600102   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:16:37.600167   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:16:37.625421   54807 cri.go:89] found id: ""
	I1202 19:16:37.625434   54807 logs.go:282] 0 containers: []
	W1202 19:16:37.625443   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:16:37.625450   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:16:37.625460   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:16:37.688980   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:16:37.689000   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:16:37.719329   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:16:37.719344   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:16:37.778206   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:16:37.778225   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:16:37.789133   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:16:37.789148   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:16:37.856498   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:16:37.848835   15218 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:37.849672   15218 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:37.851302   15218 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:37.851842   15218 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:37.853113   15218 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:16:37.848835   15218 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:37.849672   15218 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:37.851302   15218 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:37.851842   15218 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:37.853113   15218 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:16:40.358183   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:16:40.368449   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:16:40.368509   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:16:40.392705   54807 cri.go:89] found id: ""
	I1202 19:16:40.392721   54807 logs.go:282] 0 containers: []
	W1202 19:16:40.392728   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:16:40.392734   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:16:40.392796   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:16:40.417408   54807 cri.go:89] found id: ""
	I1202 19:16:40.417422   54807 logs.go:282] 0 containers: []
	W1202 19:16:40.417429   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:16:40.417435   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:16:40.417493   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:16:40.458012   54807 cri.go:89] found id: ""
	I1202 19:16:40.458026   54807 logs.go:282] 0 containers: []
	W1202 19:16:40.458033   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:16:40.458039   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:16:40.458094   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:16:40.498315   54807 cri.go:89] found id: ""
	I1202 19:16:40.498328   54807 logs.go:282] 0 containers: []
	W1202 19:16:40.498335   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:16:40.498341   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:16:40.498402   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:16:40.523770   54807 cri.go:89] found id: ""
	I1202 19:16:40.523784   54807 logs.go:282] 0 containers: []
	W1202 19:16:40.523792   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:16:40.523797   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:16:40.523865   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:16:40.549124   54807 cri.go:89] found id: ""
	I1202 19:16:40.549137   54807 logs.go:282] 0 containers: []
	W1202 19:16:40.549144   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:16:40.549149   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:16:40.549207   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:16:40.573667   54807 cri.go:89] found id: ""
	I1202 19:16:40.573680   54807 logs.go:282] 0 containers: []
	W1202 19:16:40.573688   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:16:40.573696   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:16:40.573708   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:16:40.629671   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:16:40.629688   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:16:40.640745   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:16:40.640760   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:16:40.706165   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:16:40.697573   15312 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:40.698405   15312 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:40.700020   15312 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:40.700368   15312 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:40.702012   15312 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:16:40.697573   15312 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:40.698405   15312 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:40.700020   15312 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:40.700368   15312 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:40.702012   15312 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:16:40.706175   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:16:40.706186   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:16:40.775737   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:16:40.775755   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:16:43.307135   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:16:43.317487   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:16:43.317553   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:16:43.342709   54807 cri.go:89] found id: ""
	I1202 19:16:43.342722   54807 logs.go:282] 0 containers: []
	W1202 19:16:43.342730   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:16:43.342735   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:16:43.342793   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:16:43.367380   54807 cri.go:89] found id: ""
	I1202 19:16:43.367393   54807 logs.go:282] 0 containers: []
	W1202 19:16:43.367400   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:16:43.367406   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:16:43.367462   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:16:43.394678   54807 cri.go:89] found id: ""
	I1202 19:16:43.394691   54807 logs.go:282] 0 containers: []
	W1202 19:16:43.394699   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:16:43.394704   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:16:43.394761   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:16:43.421130   54807 cri.go:89] found id: ""
	I1202 19:16:43.421144   54807 logs.go:282] 0 containers: []
	W1202 19:16:43.421151   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:16:43.421156   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:16:43.421212   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:16:43.454728   54807 cri.go:89] found id: ""
	I1202 19:16:43.454741   54807 logs.go:282] 0 containers: []
	W1202 19:16:43.454749   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:16:43.454754   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:16:43.454810   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:16:43.491457   54807 cri.go:89] found id: ""
	I1202 19:16:43.491470   54807 logs.go:282] 0 containers: []
	W1202 19:16:43.491477   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:16:43.491482   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:16:43.491537   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:16:43.515943   54807 cri.go:89] found id: ""
	I1202 19:16:43.515957   54807 logs.go:282] 0 containers: []
	W1202 19:16:43.515964   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:16:43.515972   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:16:43.515982   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:16:43.579953   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:16:43.579972   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:16:43.608617   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:16:43.608632   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:16:43.666586   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:16:43.666604   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:16:43.677358   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:16:43.677374   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:16:43.741646   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:16:43.733470   15433 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:43.734235   15433 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:43.735923   15433 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:43.736656   15433 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:43.738348   15433 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:16:43.733470   15433 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:43.734235   15433 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:43.735923   15433 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:43.736656   15433 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:43.738348   15433 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:16:46.243365   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:16:46.255599   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:16:46.255658   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:16:46.280357   54807 cri.go:89] found id: ""
	I1202 19:16:46.280369   54807 logs.go:282] 0 containers: []
	W1202 19:16:46.280376   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:16:46.280382   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:16:46.280444   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:16:46.304610   54807 cri.go:89] found id: ""
	I1202 19:16:46.304623   54807 logs.go:282] 0 containers: []
	W1202 19:16:46.304630   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:16:46.304635   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:16:46.304692   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:16:46.328944   54807 cri.go:89] found id: ""
	I1202 19:16:46.328957   54807 logs.go:282] 0 containers: []
	W1202 19:16:46.328963   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:16:46.328968   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:16:46.329027   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:16:46.357896   54807 cri.go:89] found id: ""
	I1202 19:16:46.357909   54807 logs.go:282] 0 containers: []
	W1202 19:16:46.357916   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:16:46.357923   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:16:46.357981   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:16:46.381601   54807 cri.go:89] found id: ""
	I1202 19:16:46.381613   54807 logs.go:282] 0 containers: []
	W1202 19:16:46.381620   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:16:46.381626   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:16:46.381687   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:16:46.406928   54807 cri.go:89] found id: ""
	I1202 19:16:46.406942   54807 logs.go:282] 0 containers: []
	W1202 19:16:46.406949   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:16:46.406954   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:16:46.407009   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:16:46.449373   54807 cri.go:89] found id: ""
	I1202 19:16:46.449386   54807 logs.go:282] 0 containers: []
	W1202 19:16:46.449393   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:16:46.449401   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:16:46.449411   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:16:46.516162   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:16:46.516180   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:16:46.527166   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:16:46.527183   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:16:46.590201   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:16:46.582136   15525 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:46.583309   15525 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:46.584090   15525 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:46.585115   15525 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:46.585711   15525 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:16:46.582136   15525 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:46.583309   15525 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:46.584090   15525 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:46.585115   15525 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:46.585711   15525 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:16:46.590211   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:16:46.590221   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:16:46.652574   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:16:46.652593   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:16:49.180131   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:16:49.190665   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:16:49.190729   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:16:49.215295   54807 cri.go:89] found id: ""
	I1202 19:16:49.215308   54807 logs.go:282] 0 containers: []
	W1202 19:16:49.215315   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:16:49.215321   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:16:49.215382   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:16:49.241898   54807 cri.go:89] found id: ""
	I1202 19:16:49.241912   54807 logs.go:282] 0 containers: []
	W1202 19:16:49.241919   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:16:49.241925   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:16:49.241986   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:16:49.266638   54807 cri.go:89] found id: ""
	I1202 19:16:49.266651   54807 logs.go:282] 0 containers: []
	W1202 19:16:49.266658   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:16:49.266664   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:16:49.266719   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:16:49.292478   54807 cri.go:89] found id: ""
	I1202 19:16:49.292496   54807 logs.go:282] 0 containers: []
	W1202 19:16:49.292506   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:16:49.292512   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:16:49.292589   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:16:49.318280   54807 cri.go:89] found id: ""
	I1202 19:16:49.318293   54807 logs.go:282] 0 containers: []
	W1202 19:16:49.318300   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:16:49.318306   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:16:49.318373   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:16:49.351760   54807 cri.go:89] found id: ""
	I1202 19:16:49.351774   54807 logs.go:282] 0 containers: []
	W1202 19:16:49.351787   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:16:49.351793   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:16:49.351854   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:16:49.376513   54807 cri.go:89] found id: ""
	I1202 19:16:49.376536   54807 logs.go:282] 0 containers: []
	W1202 19:16:49.376543   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:16:49.376551   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:16:49.376563   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:16:49.448960   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:16:49.448987   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:16:49.482655   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:16:49.482673   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:16:49.541305   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:16:49.541322   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:16:49.552971   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:16:49.552988   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:16:49.618105   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:16:49.609636   15641 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:49.610374   15641 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:49.612148   15641 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:49.612903   15641 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:49.614517   15641 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1202 19:16:52.119791   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:16:52.130607   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:16:52.130669   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:16:52.155643   54807 cri.go:89] found id: ""
	I1202 19:16:52.155656   54807 logs.go:282] 0 containers: []
	W1202 19:16:52.155663   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:16:52.155669   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:16:52.155729   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:16:52.179230   54807 cri.go:89] found id: ""
	I1202 19:16:52.179244   54807 logs.go:282] 0 containers: []
	W1202 19:16:52.179253   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:16:52.179259   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:16:52.179316   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:16:52.203772   54807 cri.go:89] found id: ""
	I1202 19:16:52.203785   54807 logs.go:282] 0 containers: []
	W1202 19:16:52.203792   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:16:52.203798   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:16:52.203852   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:16:52.236168   54807 cri.go:89] found id: ""
	I1202 19:16:52.236183   54807 logs.go:282] 0 containers: []
	W1202 19:16:52.236190   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:16:52.236196   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:16:52.236257   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:16:52.260979   54807 cri.go:89] found id: ""
	I1202 19:16:52.260995   54807 logs.go:282] 0 containers: []
	W1202 19:16:52.261003   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:16:52.261008   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:16:52.261063   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:16:52.284287   54807 cri.go:89] found id: ""
	I1202 19:16:52.284299   54807 logs.go:282] 0 containers: []
	W1202 19:16:52.284306   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:16:52.284312   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:16:52.284385   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:16:52.310376   54807 cri.go:89] found id: ""
	I1202 19:16:52.310390   54807 logs.go:282] 0 containers: []
	W1202 19:16:52.310397   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:16:52.310405   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:16:52.310415   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:16:52.366619   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:16:52.366636   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:16:52.377556   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:16:52.377572   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:16:52.453208   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:16:52.444029   15727 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:52.444937   15727 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:52.446866   15727 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:52.447608   15727 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:52.449367   15727 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1202 19:16:52.453218   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:16:52.453229   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:16:52.524196   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:16:52.524214   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:16:55.052717   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:16:55.063878   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:16:55.063943   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:16:55.089568   54807 cri.go:89] found id: ""
	I1202 19:16:55.089582   54807 logs.go:282] 0 containers: []
	W1202 19:16:55.089588   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:16:55.089594   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:16:55.089658   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:16:55.116741   54807 cri.go:89] found id: ""
	I1202 19:16:55.116755   54807 logs.go:282] 0 containers: []
	W1202 19:16:55.116762   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:16:55.116768   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:16:55.116825   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:16:55.142748   54807 cri.go:89] found id: ""
	I1202 19:16:55.142761   54807 logs.go:282] 0 containers: []
	W1202 19:16:55.142768   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:16:55.142774   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:16:55.142836   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:16:55.167341   54807 cri.go:89] found id: ""
	I1202 19:16:55.167354   54807 logs.go:282] 0 containers: []
	W1202 19:16:55.167361   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:16:55.167367   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:16:55.167424   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:16:55.194118   54807 cri.go:89] found id: ""
	I1202 19:16:55.194132   54807 logs.go:282] 0 containers: []
	W1202 19:16:55.194139   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:16:55.194144   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:16:55.194201   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:16:55.218379   54807 cri.go:89] found id: ""
	I1202 19:16:55.218393   54807 logs.go:282] 0 containers: []
	W1202 19:16:55.218400   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:16:55.218406   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:16:55.218465   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:16:55.243035   54807 cri.go:89] found id: ""
	I1202 19:16:55.243048   54807 logs.go:282] 0 containers: []
	W1202 19:16:55.243055   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:16:55.243063   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:16:55.243073   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:16:55.310493   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:16:55.302627   15827 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:55.303246   15827 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:55.304790   15827 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:55.305303   15827 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:55.306777   15827 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1202 19:16:55.310504   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:16:55.310517   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:16:55.373914   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:16:55.373933   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:16:55.405157   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:16:55.405172   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:16:55.473565   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:16:55.473583   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:16:57.986363   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:16:57.996902   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:16:57.996969   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:16:58.023028   54807 cri.go:89] found id: ""
	I1202 19:16:58.023042   54807 logs.go:282] 0 containers: []
	W1202 19:16:58.023049   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:16:58.023055   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:16:58.023113   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:16:58.049927   54807 cri.go:89] found id: ""
	I1202 19:16:58.049941   54807 logs.go:282] 0 containers: []
	W1202 19:16:58.049947   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:16:58.049953   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:16:58.050013   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:16:58.078428   54807 cri.go:89] found id: ""
	I1202 19:16:58.078448   54807 logs.go:282] 0 containers: []
	W1202 19:16:58.078456   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:16:58.078461   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:16:58.078516   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:16:58.105365   54807 cri.go:89] found id: ""
	I1202 19:16:58.105377   54807 logs.go:282] 0 containers: []
	W1202 19:16:58.105385   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:16:58.105390   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:16:58.105448   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:16:58.129444   54807 cri.go:89] found id: ""
	I1202 19:16:58.129458   54807 logs.go:282] 0 containers: []
	W1202 19:16:58.129465   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:16:58.129470   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:16:58.129531   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:16:58.157574   54807 cri.go:89] found id: ""
	I1202 19:16:58.157588   54807 logs.go:282] 0 containers: []
	W1202 19:16:58.157594   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:16:58.157607   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:16:58.157670   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:16:58.182028   54807 cri.go:89] found id: ""
	I1202 19:16:58.182042   54807 logs.go:282] 0 containers: []
	W1202 19:16:58.182049   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:16:58.182057   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:16:58.182067   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:16:58.241166   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:16:58.241184   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:16:58.252367   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:16:58.252383   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:16:58.319914   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:16:58.311929   15940 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:58.312824   15940 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:58.314498   15940 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:58.315073   15940 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:58.316444   15940 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1202 19:16:58.319925   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:16:58.319937   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:16:58.381228   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:16:58.381246   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:17:00.909644   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:17:00.920924   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:17:00.921037   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:17:00.947793   54807 cri.go:89] found id: ""
	I1202 19:17:00.947812   54807 logs.go:282] 0 containers: []
	W1202 19:17:00.947820   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:17:00.947828   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:17:00.947900   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:17:00.975539   54807 cri.go:89] found id: ""
	I1202 19:17:00.975553   54807 logs.go:282] 0 containers: []
	W1202 19:17:00.975561   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:17:00.975566   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:17:00.975629   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:17:01.002532   54807 cri.go:89] found id: ""
	I1202 19:17:01.002549   54807 logs.go:282] 0 containers: []
	W1202 19:17:01.002560   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:17:01.002566   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:17:01.002636   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:17:01.032211   54807 cri.go:89] found id: ""
	I1202 19:17:01.032226   54807 logs.go:282] 0 containers: []
	W1202 19:17:01.032233   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:17:01.032239   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:17:01.032302   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:17:01.059398   54807 cri.go:89] found id: ""
	I1202 19:17:01.059413   54807 logs.go:282] 0 containers: []
	W1202 19:17:01.059420   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:17:01.059426   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:17:01.059486   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:17:01.091722   54807 cri.go:89] found id: ""
	I1202 19:17:01.091740   54807 logs.go:282] 0 containers: []
	W1202 19:17:01.091746   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:17:01.091752   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:17:01.091816   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:17:01.117849   54807 cri.go:89] found id: ""
	I1202 19:17:01.117864   54807 logs.go:282] 0 containers: []
	W1202 19:17:01.117871   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:17:01.117879   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:17:01.117893   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:17:01.191972   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:17:01.182202   16040 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:01.183119   16040 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:01.185191   16040 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:01.186030   16040 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:01.187874   16040 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1202 19:17:01.191984   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:17:01.191997   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:17:01.260783   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:17:01.260806   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:17:01.290665   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:17:01.290683   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:17:01.348633   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:17:01.348653   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:17:03.860845   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:17:03.871899   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:17:03.871966   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:17:03.899158   54807 cri.go:89] found id: ""
	I1202 19:17:03.899172   54807 logs.go:282] 0 containers: []
	W1202 19:17:03.899179   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:17:03.899185   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:17:03.899244   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:17:03.925147   54807 cri.go:89] found id: ""
	I1202 19:17:03.925161   54807 logs.go:282] 0 containers: []
	W1202 19:17:03.925168   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:17:03.925174   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:17:03.925235   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:17:03.955130   54807 cri.go:89] found id: ""
	I1202 19:17:03.955143   54807 logs.go:282] 0 containers: []
	W1202 19:17:03.955150   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:17:03.955156   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:17:03.955215   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:17:03.983272   54807 cri.go:89] found id: ""
	I1202 19:17:03.983286   54807 logs.go:282] 0 containers: []
	W1202 19:17:03.983294   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:17:03.983300   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:17:03.983371   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:17:04.009435   54807 cri.go:89] found id: ""
	I1202 19:17:04.009449   54807 logs.go:282] 0 containers: []
	W1202 19:17:04.009456   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:17:04.009463   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:17:04.009523   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:17:04.037346   54807 cri.go:89] found id: ""
	I1202 19:17:04.037360   54807 logs.go:282] 0 containers: []
	W1202 19:17:04.037368   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:17:04.037374   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:17:04.037433   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:17:04.066662   54807 cri.go:89] found id: ""
	I1202 19:17:04.066675   54807 logs.go:282] 0 containers: []
	W1202 19:17:04.066682   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:17:04.066690   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:17:04.066701   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:17:04.125350   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:17:04.125369   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:17:04.136698   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:17:04.136716   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:17:04.206327   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:17:04.198119   16153 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:04.198761   16153 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:04.200411   16153 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:04.201024   16153 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:04.202642   16153 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:17:04.198119   16153 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:04.198761   16153 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:04.200411   16153 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:04.201024   16153 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:04.202642   16153 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:17:04.206338   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:17:04.206353   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:17:04.274588   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:17:04.274608   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:17:06.806010   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:17:06.817189   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:17:06.817256   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:17:06.843114   54807 cri.go:89] found id: ""
	I1202 19:17:06.843129   54807 logs.go:282] 0 containers: []
	W1202 19:17:06.843136   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:17:06.843142   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:17:06.843218   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:17:06.873921   54807 cri.go:89] found id: ""
	I1202 19:17:06.873947   54807 logs.go:282] 0 containers: []
	W1202 19:17:06.873955   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:17:06.873961   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:17:06.874045   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:17:06.900636   54807 cri.go:89] found id: ""
	I1202 19:17:06.900651   54807 logs.go:282] 0 containers: []
	W1202 19:17:06.900658   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:17:06.900664   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:17:06.900724   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:17:06.928484   54807 cri.go:89] found id: ""
	I1202 19:17:06.928504   54807 logs.go:282] 0 containers: []
	W1202 19:17:06.928512   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:17:06.928518   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:17:06.928583   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:17:06.956137   54807 cri.go:89] found id: ""
	I1202 19:17:06.956170   54807 logs.go:282] 0 containers: []
	W1202 19:17:06.956179   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:17:06.956184   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:17:06.956258   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:17:06.987383   54807 cri.go:89] found id: ""
	I1202 19:17:06.987408   54807 logs.go:282] 0 containers: []
	W1202 19:17:06.987416   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:17:06.987422   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:17:06.987495   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:17:07.013712   54807 cri.go:89] found id: ""
	I1202 19:17:07.013726   54807 logs.go:282] 0 containers: []
	W1202 19:17:07.013733   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:17:07.013741   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:17:07.013756   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:17:07.076937   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:17:07.076955   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:17:07.106847   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:17:07.106863   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:17:07.164565   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:17:07.164584   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:17:07.177132   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:17:07.177154   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:17:07.245572   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:17:07.237375   16272 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:07.238081   16272 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:07.239663   16272 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:07.240205   16272 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:07.241966   16272 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:17:07.237375   16272 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:07.238081   16272 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:07.239663   16272 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:07.240205   16272 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:07.241966   16272 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:17:09.745822   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:17:09.756122   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:17:09.756180   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:17:09.784649   54807 cri.go:89] found id: ""
	I1202 19:17:09.784663   54807 logs.go:282] 0 containers: []
	W1202 19:17:09.784670   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:17:09.784675   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:17:09.784732   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:17:09.809632   54807 cri.go:89] found id: ""
	I1202 19:17:09.809655   54807 logs.go:282] 0 containers: []
	W1202 19:17:09.809662   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:17:09.809668   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:17:09.809733   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:17:09.839403   54807 cri.go:89] found id: ""
	I1202 19:17:09.839425   54807 logs.go:282] 0 containers: []
	W1202 19:17:09.839433   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:17:09.839439   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:17:09.839504   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:17:09.868977   54807 cri.go:89] found id: ""
	I1202 19:17:09.868991   54807 logs.go:282] 0 containers: []
	W1202 19:17:09.868999   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:17:09.869004   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:17:09.869064   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:17:09.894156   54807 cri.go:89] found id: ""
	I1202 19:17:09.894170   54807 logs.go:282] 0 containers: []
	W1202 19:17:09.894176   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:17:09.894182   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:17:09.894237   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:17:09.919174   54807 cri.go:89] found id: ""
	I1202 19:17:09.919188   54807 logs.go:282] 0 containers: []
	W1202 19:17:09.919195   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:17:09.919200   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:17:09.919261   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:17:09.944620   54807 cri.go:89] found id: ""
	I1202 19:17:09.944632   54807 logs.go:282] 0 containers: []
	W1202 19:17:09.944639   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:17:09.944647   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:17:09.944657   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:17:10.004028   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:17:10.004049   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:17:10.015962   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:17:10.015979   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:17:10.086133   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:17:10.078544   16365 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:10.079196   16365 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:10.080924   16365 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:10.081452   16365 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:10.082613   16365 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:17:10.078544   16365 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:10.079196   16365 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:10.080924   16365 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:10.081452   16365 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:10.082613   16365 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:17:10.086143   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:17:10.086153   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:17:10.148419   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:17:10.148437   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:17:12.676458   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:17:12.687083   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:17:12.687155   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:17:12.712590   54807 cri.go:89] found id: ""
	I1202 19:17:12.712604   54807 logs.go:282] 0 containers: []
	W1202 19:17:12.712611   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:17:12.712616   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:17:12.712674   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:17:12.737565   54807 cri.go:89] found id: ""
	I1202 19:17:12.737578   54807 logs.go:282] 0 containers: []
	W1202 19:17:12.737585   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:17:12.737591   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:17:12.737648   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:17:12.762201   54807 cri.go:89] found id: ""
	I1202 19:17:12.762216   54807 logs.go:282] 0 containers: []
	W1202 19:17:12.762223   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:17:12.762228   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:17:12.762288   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:17:12.786736   54807 cri.go:89] found id: ""
	I1202 19:17:12.786750   54807 logs.go:282] 0 containers: []
	W1202 19:17:12.786758   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:17:12.786763   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:17:12.786825   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:17:12.811994   54807 cri.go:89] found id: ""
	I1202 19:17:12.812008   54807 logs.go:282] 0 containers: []
	W1202 19:17:12.812015   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:17:12.812020   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:17:12.812078   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:17:12.838580   54807 cri.go:89] found id: ""
	I1202 19:17:12.838593   54807 logs.go:282] 0 containers: []
	W1202 19:17:12.838600   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:17:12.838605   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:17:12.838659   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:17:12.863652   54807 cri.go:89] found id: ""
	I1202 19:17:12.863665   54807 logs.go:282] 0 containers: []
	W1202 19:17:12.863672   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:17:12.863679   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:17:12.863689   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:17:12.918766   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:17:12.918784   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:17:12.930406   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:17:12.930428   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:17:13.000633   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:17:12.992135   16467 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:12.992970   16467 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:12.994592   16467 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:12.994977   16467 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:12.996559   16467 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:17:12.992135   16467 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:12.992970   16467 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:12.994592   16467 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:12.994977   16467 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:12.996559   16467 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:17:13.000643   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:17:13.000655   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:17:13.065384   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:17:13.065403   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:17:15.594382   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:17:15.604731   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:17:15.604795   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:17:15.634332   54807 cri.go:89] found id: ""
	I1202 19:17:15.634345   54807 logs.go:282] 0 containers: []
	W1202 19:17:15.634353   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:17:15.634358   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:17:15.634434   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:17:15.663126   54807 cri.go:89] found id: ""
	I1202 19:17:15.663141   54807 logs.go:282] 0 containers: []
	W1202 19:17:15.663148   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:17:15.663153   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:17:15.663217   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:17:15.699033   54807 cri.go:89] found id: ""
	I1202 19:17:15.699051   54807 logs.go:282] 0 containers: []
	W1202 19:17:15.699059   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:17:15.699065   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:17:15.699121   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:17:15.727044   54807 cri.go:89] found id: ""
	I1202 19:17:15.727057   54807 logs.go:282] 0 containers: []
	W1202 19:17:15.727065   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:17:15.727071   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:17:15.727129   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:17:15.754131   54807 cri.go:89] found id: ""
	I1202 19:17:15.754152   54807 logs.go:282] 0 containers: []
	W1202 19:17:15.754159   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:17:15.754165   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:17:15.754224   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:17:15.778325   54807 cri.go:89] found id: ""
	I1202 19:17:15.778338   54807 logs.go:282] 0 containers: []
	W1202 19:17:15.778345   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:17:15.778350   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:17:15.778407   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:17:15.803363   54807 cri.go:89] found id: ""
	I1202 19:17:15.803376   54807 logs.go:282] 0 containers: []
	W1202 19:17:15.803383   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:17:15.803391   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:17:15.803403   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:17:15.814039   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:17:15.814055   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:17:15.885494   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:17:15.877151   16570 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:15.877774   16570 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:15.878766   16570 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:15.880347   16570 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:15.880955   16570 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:17:15.877151   16570 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:15.877774   16570 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:15.878766   16570 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:15.880347   16570 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:15.880955   16570 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:17:15.885505   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:17:15.885516   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:17:15.947276   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:17:15.947295   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:17:15.979963   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:17:15.979981   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:17:18.538313   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:17:18.548423   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:17:18.548490   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:17:18.571700   54807 cri.go:89] found id: ""
	I1202 19:17:18.571714   54807 logs.go:282] 0 containers: []
	W1202 19:17:18.571721   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:17:18.571726   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:17:18.571784   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:17:18.600197   54807 cri.go:89] found id: ""
	I1202 19:17:18.600211   54807 logs.go:282] 0 containers: []
	W1202 19:17:18.600219   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:17:18.600224   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:17:18.600279   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:17:18.628309   54807 cri.go:89] found id: ""
	I1202 19:17:18.628341   54807 logs.go:282] 0 containers: []
	W1202 19:17:18.628348   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:17:18.628353   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:17:18.628440   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:17:18.654241   54807 cri.go:89] found id: ""
	I1202 19:17:18.654255   54807 logs.go:282] 0 containers: []
	W1202 19:17:18.654263   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:17:18.654268   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:17:18.654325   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:17:18.690109   54807 cri.go:89] found id: ""
	I1202 19:17:18.690123   54807 logs.go:282] 0 containers: []
	W1202 19:17:18.690130   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:17:18.690135   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:17:18.690194   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:17:18.719625   54807 cri.go:89] found id: ""
	I1202 19:17:18.719638   54807 logs.go:282] 0 containers: []
	W1202 19:17:18.719646   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:17:18.719651   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:17:18.719713   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:17:18.753094   54807 cri.go:89] found id: ""
	I1202 19:17:18.753108   54807 logs.go:282] 0 containers: []
	W1202 19:17:18.753116   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:17:18.753124   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:17:18.753135   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:17:18.782592   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:17:18.782608   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:17:18.837738   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:17:18.837757   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:17:18.848921   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:17:18.848937   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:17:18.918012   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:17:18.908756   16691 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:18.909735   16691 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:18.911590   16691 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:18.912295   16691 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:18.913925   16691 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:17:18.908756   16691 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:18.909735   16691 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:18.911590   16691 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:18.912295   16691 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:18.913925   16691 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:17:18.918023   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:17:18.918034   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:17:21.481252   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:17:21.491493   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:17:21.491550   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:17:21.515967   54807 cri.go:89] found id: ""
	I1202 19:17:21.515980   54807 logs.go:282] 0 containers: []
	W1202 19:17:21.515987   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:17:21.515993   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:17:21.516049   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:17:21.545239   54807 cri.go:89] found id: ""
	I1202 19:17:21.545256   54807 logs.go:282] 0 containers: []
	W1202 19:17:21.545263   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:17:21.545268   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:17:21.545349   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:17:21.574561   54807 cri.go:89] found id: ""
	I1202 19:17:21.574575   54807 logs.go:282] 0 containers: []
	W1202 19:17:21.574582   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:17:21.574588   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:17:21.574643   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:17:21.600546   54807 cri.go:89] found id: ""
	I1202 19:17:21.600567   54807 logs.go:282] 0 containers: []
	W1202 19:17:21.600575   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:17:21.600581   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:17:21.600647   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:17:21.625602   54807 cri.go:89] found id: ""
	I1202 19:17:21.625616   54807 logs.go:282] 0 containers: []
	W1202 19:17:21.625623   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:17:21.625629   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:17:21.625691   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:17:21.650573   54807 cri.go:89] found id: ""
	I1202 19:17:21.650586   54807 logs.go:282] 0 containers: []
	W1202 19:17:21.650593   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:17:21.650599   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:17:21.650655   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:17:21.680099   54807 cri.go:89] found id: ""
	I1202 19:17:21.680113   54807 logs.go:282] 0 containers: []
	W1202 19:17:21.680120   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:17:21.680128   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:17:21.680155   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:17:21.750582   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:17:21.750601   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:17:21.762564   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:17:21.762580   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:17:21.827497   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:17:21.819360   16787 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:21.820080   16787 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:21.821817   16787 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:21.822472   16787 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:21.824049   16787 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:17:21.819360   16787 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:21.820080   16787 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:21.821817   16787 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:21.822472   16787 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:21.824049   16787 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:17:21.827507   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:17:21.827518   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:17:21.889794   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:17:21.889812   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:17:24.421754   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:17:24.432162   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:17:24.432233   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:17:24.456800   54807 cri.go:89] found id: ""
	I1202 19:17:24.456814   54807 logs.go:282] 0 containers: []
	W1202 19:17:24.456821   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:17:24.456826   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:17:24.456901   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:17:24.481502   54807 cri.go:89] found id: ""
	I1202 19:17:24.481516   54807 logs.go:282] 0 containers: []
	W1202 19:17:24.481523   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:17:24.481529   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:17:24.481587   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:17:24.505876   54807 cri.go:89] found id: ""
	I1202 19:17:24.505918   54807 logs.go:282] 0 containers: []
	W1202 19:17:24.505925   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:17:24.505931   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:17:24.505990   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:17:24.530651   54807 cri.go:89] found id: ""
	I1202 19:17:24.530665   54807 logs.go:282] 0 containers: []
	W1202 19:17:24.530673   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:17:24.530689   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:17:24.530749   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:17:24.556247   54807 cri.go:89] found id: ""
	I1202 19:17:24.556260   54807 logs.go:282] 0 containers: []
	W1202 19:17:24.556277   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:17:24.556283   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:17:24.556391   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:17:24.585748   54807 cri.go:89] found id: ""
	I1202 19:17:24.585761   54807 logs.go:282] 0 containers: []
	W1202 19:17:24.585769   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:17:24.585774   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:17:24.585833   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:17:24.610350   54807 cri.go:89] found id: ""
	I1202 19:17:24.610363   54807 logs.go:282] 0 containers: []
	W1202 19:17:24.610370   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:17:24.610377   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:17:24.610388   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:17:24.680866   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:17:24.664630   16879 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:24.665302   16879 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:24.667106   16879 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:24.667645   16879 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:24.669330   16879 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:17:24.664630   16879 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:24.665302   16879 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:24.667106   16879 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:24.667645   16879 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:24.669330   16879 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:17:24.680876   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:17:24.680887   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:17:24.756955   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:17:24.756975   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:17:24.784854   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:17:24.784869   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:17:24.849848   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:17:24.849872   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:17:27.361613   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:17:27.375047   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:17:27.375145   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:17:27.399753   54807 cri.go:89] found id: ""
	I1202 19:17:27.399767   54807 logs.go:282] 0 containers: []
	W1202 19:17:27.399774   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:17:27.399780   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:17:27.399838   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:17:27.430016   54807 cri.go:89] found id: ""
	I1202 19:17:27.430030   54807 logs.go:282] 0 containers: []
	W1202 19:17:27.430037   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:17:27.430043   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:17:27.430102   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:17:27.455165   54807 cri.go:89] found id: ""
	I1202 19:17:27.455178   54807 logs.go:282] 0 containers: []
	W1202 19:17:27.455186   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:17:27.455191   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:17:27.455251   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:17:27.481353   54807 cri.go:89] found id: ""
	I1202 19:17:27.481367   54807 logs.go:282] 0 containers: []
	W1202 19:17:27.481374   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:17:27.481380   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:17:27.481437   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:17:27.505602   54807 cri.go:89] found id: ""
	I1202 19:17:27.505615   54807 logs.go:282] 0 containers: []
	W1202 19:17:27.505622   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:17:27.505627   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:17:27.505685   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:17:27.531062   54807 cri.go:89] found id: ""
	I1202 19:17:27.531075   54807 logs.go:282] 0 containers: []
	W1202 19:17:27.531082   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:17:27.531087   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:17:27.531143   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:17:27.556614   54807 cri.go:89] found id: ""
	I1202 19:17:27.556628   54807 logs.go:282] 0 containers: []
	W1202 19:17:27.556635   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:17:27.556642   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:17:27.556653   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:17:27.623535   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:17:27.615982   16984 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:27.616782   16984 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:27.618353   16984 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:27.618650   16984 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:27.620147   16984 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:17:27.615982   16984 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:27.616782   16984 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:27.618353   16984 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:27.618650   16984 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:27.620147   16984 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:17:27.623546   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:17:27.623557   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:17:27.692276   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:17:27.692294   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:17:27.728468   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:17:27.728489   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:17:27.790653   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:17:27.790670   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:17:30.302100   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:17:30.313066   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:17:30.313144   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:17:30.340123   54807 cri.go:89] found id: ""
	I1202 19:17:30.340137   54807 logs.go:282] 0 containers: []
	W1202 19:17:30.340144   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:17:30.340149   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:17:30.340208   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:17:30.365806   54807 cri.go:89] found id: ""
	I1202 19:17:30.365820   54807 logs.go:282] 0 containers: []
	W1202 19:17:30.365835   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:17:30.365841   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:17:30.365904   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:17:30.391688   54807 cri.go:89] found id: ""
	I1202 19:17:30.391701   54807 logs.go:282] 0 containers: []
	W1202 19:17:30.391708   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:17:30.391714   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:17:30.391771   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:17:30.416982   54807 cri.go:89] found id: ""
	I1202 19:17:30.416996   54807 logs.go:282] 0 containers: []
	W1202 19:17:30.417013   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:17:30.417019   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:17:30.417117   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:17:30.443139   54807 cri.go:89] found id: ""
	I1202 19:17:30.443153   54807 logs.go:282] 0 containers: []
	W1202 19:17:30.443162   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:17:30.443168   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:17:30.443226   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:17:30.468557   54807 cri.go:89] found id: ""
	I1202 19:17:30.468571   54807 logs.go:282] 0 containers: []
	W1202 19:17:30.468579   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:17:30.468584   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:17:30.468641   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:17:30.494467   54807 cri.go:89] found id: ""
	I1202 19:17:30.494480   54807 logs.go:282] 0 containers: []
	W1202 19:17:30.494488   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:17:30.494502   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:17:30.494515   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:17:30.551986   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:17:30.552005   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:17:30.563168   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:17:30.563184   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:17:30.628562   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:17:30.620608   17094 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:30.621322   17094 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:30.623022   17094 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:30.623454   17094 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:30.625008   17094 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:17:30.620608   17094 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:30.621322   17094 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:30.623022   17094 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:30.623454   17094 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:30.625008   17094 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:17:30.628573   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:17:30.628584   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:17:30.691460   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:17:30.691478   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:17:33.223672   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:17:33.234425   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:17:33.234485   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:17:33.262500   54807 cri.go:89] found id: ""
	I1202 19:17:33.262514   54807 logs.go:282] 0 containers: []
	W1202 19:17:33.262521   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:17:33.262527   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:17:33.262590   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:17:33.287888   54807 cri.go:89] found id: ""
	I1202 19:17:33.287902   54807 logs.go:282] 0 containers: []
	W1202 19:17:33.287921   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:17:33.287926   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:17:33.287995   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:17:33.314581   54807 cri.go:89] found id: ""
	I1202 19:17:33.314594   54807 logs.go:282] 0 containers: []
	W1202 19:17:33.314601   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:17:33.314607   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:17:33.314671   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:17:33.338734   54807 cri.go:89] found id: ""
	I1202 19:17:33.338747   54807 logs.go:282] 0 containers: []
	W1202 19:17:33.338755   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:17:33.338760   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:17:33.338818   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:17:33.363343   54807 cri.go:89] found id: ""
	I1202 19:17:33.363356   54807 logs.go:282] 0 containers: []
	W1202 19:17:33.363363   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:17:33.363369   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:17:33.363425   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:17:33.388256   54807 cri.go:89] found id: ""
	I1202 19:17:33.388270   54807 logs.go:282] 0 containers: []
	W1202 19:17:33.388277   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:17:33.388283   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:17:33.388360   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:17:33.412424   54807 cri.go:89] found id: ""
	I1202 19:17:33.412449   54807 logs.go:282] 0 containers: []
	W1202 19:17:33.412456   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:17:33.412465   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:17:33.412475   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:17:33.467817   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:17:33.467835   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:17:33.479194   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:17:33.479209   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:17:33.548484   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:17:33.540299   17201 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:33.540851   17201 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:33.542579   17201 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:33.543074   17201 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:33.544871   17201 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:17:33.540299   17201 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:33.540851   17201 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:33.542579   17201 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:33.543074   17201 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:33.544871   17201 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:17:33.548494   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:17:33.548505   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:17:33.612889   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:17:33.612909   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:17:36.146985   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:17:36.158019   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:17:36.158079   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:17:36.188906   54807 cri.go:89] found id: ""
	I1202 19:17:36.188919   54807 logs.go:282] 0 containers: []
	W1202 19:17:36.188932   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:17:36.188938   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:17:36.188996   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:17:36.213390   54807 cri.go:89] found id: ""
	I1202 19:17:36.213404   54807 logs.go:282] 0 containers: []
	W1202 19:17:36.213411   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:17:36.213416   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:17:36.213481   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:17:36.242801   54807 cri.go:89] found id: ""
	I1202 19:17:36.242814   54807 logs.go:282] 0 containers: []
	W1202 19:17:36.242822   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:17:36.242827   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:17:36.242882   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:17:36.269121   54807 cri.go:89] found id: ""
	I1202 19:17:36.269142   54807 logs.go:282] 0 containers: []
	W1202 19:17:36.269149   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:17:36.269155   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:17:36.269212   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:17:36.295182   54807 cri.go:89] found id: ""
	I1202 19:17:36.295196   54807 logs.go:282] 0 containers: []
	W1202 19:17:36.295203   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:17:36.295208   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:17:36.295265   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:17:36.320684   54807 cri.go:89] found id: ""
	I1202 19:17:36.320698   54807 logs.go:282] 0 containers: []
	W1202 19:17:36.320705   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:17:36.320711   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:17:36.320783   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:17:36.347524   54807 cri.go:89] found id: ""
	I1202 19:17:36.347537   54807 logs.go:282] 0 containers: []
	W1202 19:17:36.347545   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:17:36.347553   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:17:36.347564   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:17:36.358349   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:17:36.358364   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:17:36.419970   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:17:36.412170   17302 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:36.412950   17302 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:36.414493   17302 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:36.414845   17302 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:36.416368   17302 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:17:36.412170   17302 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:36.412950   17302 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:36.414493   17302 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:36.414845   17302 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:36.416368   17302 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:17:36.419980   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:17:36.419991   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:17:36.482180   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:17:36.482199   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:17:36.511443   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:17:36.511458   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:17:39.067437   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:17:39.077694   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:17:39.077763   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:17:39.102742   54807 cri.go:89] found id: ""
	I1202 19:17:39.102755   54807 logs.go:282] 0 containers: []
	W1202 19:17:39.102762   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:17:39.102768   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:17:39.102824   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:17:39.127352   54807 cri.go:89] found id: ""
	I1202 19:17:39.127365   54807 logs.go:282] 0 containers: []
	W1202 19:17:39.127371   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:17:39.127376   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:17:39.127433   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:17:39.155704   54807 cri.go:89] found id: ""
	I1202 19:17:39.155717   54807 logs.go:282] 0 containers: []
	W1202 19:17:39.155725   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:17:39.155730   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:17:39.155793   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:17:39.181102   54807 cri.go:89] found id: ""
	I1202 19:17:39.181121   54807 logs.go:282] 0 containers: []
	W1202 19:17:39.181128   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:17:39.181133   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:17:39.181193   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:17:39.204855   54807 cri.go:89] found id: ""
	I1202 19:17:39.204869   54807 logs.go:282] 0 containers: []
	W1202 19:17:39.204876   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:17:39.204881   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:17:39.204936   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:17:39.228875   54807 cri.go:89] found id: ""
	I1202 19:17:39.228889   54807 logs.go:282] 0 containers: []
	W1202 19:17:39.228896   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:17:39.228901   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:17:39.228961   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:17:39.254647   54807 cri.go:89] found id: ""
	I1202 19:17:39.254661   54807 logs.go:282] 0 containers: []
	W1202 19:17:39.254668   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:17:39.254681   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:17:39.254696   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:17:39.266611   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:17:39.266628   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:17:39.329195   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:17:39.321572   17406 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:39.322010   17406 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:39.323495   17406 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:39.323811   17406 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:39.325283   17406 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:17:39.321572   17406 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:39.322010   17406 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:39.323495   17406 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:39.323811   17406 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:39.325283   17406 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:17:39.329204   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:17:39.329215   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:17:39.390326   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:17:39.390345   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:17:39.419151   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:17:39.419176   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:17:41.975528   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:17:41.989057   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:17:41.989132   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:17:42.018363   54807 cri.go:89] found id: ""
	I1202 19:17:42.018376   54807 logs.go:282] 0 containers: []
	W1202 19:17:42.018384   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:17:42.018390   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:17:42.018453   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:17:42.045176   54807 cri.go:89] found id: ""
	I1202 19:17:42.045192   54807 logs.go:282] 0 containers: []
	W1202 19:17:42.045200   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:17:42.045206   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:17:42.045290   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:17:42.075758   54807 cri.go:89] found id: ""
	I1202 19:17:42.075773   54807 logs.go:282] 0 containers: []
	W1202 19:17:42.075781   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:17:42.075787   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:17:42.075856   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:17:42.111739   54807 cri.go:89] found id: ""
	I1202 19:17:42.111754   54807 logs.go:282] 0 containers: []
	W1202 19:17:42.111760   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:17:42.111767   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:17:42.111829   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:17:42.141340   54807 cri.go:89] found id: ""
	I1202 19:17:42.141358   54807 logs.go:282] 0 containers: []
	W1202 19:17:42.141368   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:17:42.141374   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:17:42.141453   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:17:42.171125   54807 cri.go:89] found id: ""
	I1202 19:17:42.171140   54807 logs.go:282] 0 containers: []
	W1202 19:17:42.171159   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:17:42.171166   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:17:42.171236   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:17:42.200254   54807 cri.go:89] found id: ""
	I1202 19:17:42.200272   54807 logs.go:282] 0 containers: []
	W1202 19:17:42.200280   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:17:42.200292   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:17:42.200307   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:17:42.256751   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:17:42.256772   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:17:42.269101   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:17:42.269118   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:17:42.336339   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:17:42.328432   17515 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:42.329095   17515 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:42.330828   17515 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:42.331361   17515 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:42.332853   17515 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:17:42.328432   17515 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:42.329095   17515 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:42.330828   17515 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:42.331361   17515 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:42.332853   17515 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:17:42.336350   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:17:42.336361   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:17:42.397522   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:17:42.397540   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:17:44.932481   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:17:44.944310   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:17:44.944439   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:17:44.992545   54807 cri.go:89] found id: ""
	I1202 19:17:44.992561   54807 logs.go:282] 0 containers: []
	W1202 19:17:44.992568   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:17:44.992574   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:17:44.992643   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:17:45.041739   54807 cri.go:89] found id: ""
	I1202 19:17:45.041756   54807 logs.go:282] 0 containers: []
	W1202 19:17:45.041764   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:17:45.041770   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:17:45.041849   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:17:45.083378   54807 cri.go:89] found id: ""
	I1202 19:17:45.083394   54807 logs.go:282] 0 containers: []
	W1202 19:17:45.083402   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:17:45.083407   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:17:45.083483   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:17:45.119179   54807 cri.go:89] found id: ""
	I1202 19:17:45.119206   54807 logs.go:282] 0 containers: []
	W1202 19:17:45.119214   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:17:45.119220   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:17:45.119340   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:17:45.156515   54807 cri.go:89] found id: ""
	I1202 19:17:45.156574   54807 logs.go:282] 0 containers: []
	W1202 19:17:45.156583   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:17:45.156590   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:17:45.156760   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:17:45.195862   54807 cri.go:89] found id: ""
	I1202 19:17:45.195877   54807 logs.go:282] 0 containers: []
	W1202 19:17:45.195885   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:17:45.195892   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:17:45.195968   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:17:45.229425   54807 cri.go:89] found id: ""
	I1202 19:17:45.229448   54807 logs.go:282] 0 containers: []
	W1202 19:17:45.229457   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:17:45.229466   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:17:45.229477   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:17:45.293109   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:17:45.293125   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:17:45.303969   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:17:45.303985   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:17:45.371653   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:17:45.362842   17616 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:45.363639   17616 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:45.365366   17616 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:45.366008   17616 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:45.367671   17616 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:17:45.362842   17616 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:45.363639   17616 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:45.365366   17616 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:45.366008   17616 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:45.367671   17616 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:17:45.371662   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:17:45.371673   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:17:45.436450   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:17:45.436469   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:17:47.967684   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:17:47.979933   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:17:47.980001   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:17:48.006489   54807 cri.go:89] found id: ""
	I1202 19:17:48.006503   54807 logs.go:282] 0 containers: []
	W1202 19:17:48.006511   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:17:48.006517   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:17:48.006580   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:17:48.035723   54807 cri.go:89] found id: ""
	I1202 19:17:48.035737   54807 logs.go:282] 0 containers: []
	W1202 19:17:48.035745   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:17:48.035760   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:17:48.035820   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:17:48.065220   54807 cri.go:89] found id: ""
	I1202 19:17:48.065233   54807 logs.go:282] 0 containers: []
	W1202 19:17:48.065251   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:17:48.065260   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:17:48.065332   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:17:48.088782   54807 cri.go:89] found id: ""
	I1202 19:17:48.088796   54807 logs.go:282] 0 containers: []
	W1202 19:17:48.088803   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:17:48.088809   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:17:48.088865   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:17:48.113775   54807 cri.go:89] found id: ""
	I1202 19:17:48.113788   54807 logs.go:282] 0 containers: []
	W1202 19:17:48.113799   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:17:48.113808   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:17:48.113867   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:17:48.140235   54807 cri.go:89] found id: ""
	I1202 19:17:48.140248   54807 logs.go:282] 0 containers: []
	W1202 19:17:48.140254   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:17:48.140260   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:17:48.140315   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:17:48.166089   54807 cri.go:89] found id: ""
	I1202 19:17:48.166102   54807 logs.go:282] 0 containers: []
	W1202 19:17:48.166108   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:17:48.166116   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:17:48.166126   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:17:48.192826   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:17:48.192842   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:17:48.248078   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:17:48.248098   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:17:48.258722   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:17:48.258737   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:17:48.323436   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:17:48.316144   17731 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:48.316536   17731 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:48.318116   17731 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:48.318415   17731 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:48.319885   17731 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:17:48.316144   17731 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:48.316536   17731 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:48.318116   17731 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:48.318415   17731 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:48.319885   17731 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:17:48.323445   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:17:48.323456   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:17:50.885477   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:17:50.895878   54807 kubeadm.go:602] duration metric: took 4m3.997047772s to restartPrimaryControlPlane
	W1202 19:17:50.895945   54807 out.go:285] ! Unable to restart control-plane node(s), will reset cluster: <no value>
	I1202 19:17:50.896022   54807 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm reset --cri-socket /run/containerd/containerd.sock --force"
	I1202 19:17:51.304711   54807 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I1202 19:17:51.317725   54807 ssh_runner.go:195] Run: sudo cp /var/tmp/minikube/kubeadm.yaml.new /var/tmp/minikube/kubeadm.yaml
	I1202 19:17:51.325312   54807 kubeadm.go:215] ignoring SystemVerification for kubeadm because of docker driver
	I1202 19:17:51.325381   54807 ssh_runner.go:195] Run: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf
	I1202 19:17:51.332895   54807 kubeadm.go:156] config check failed, skipping stale config cleanup: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	ls: cannot access '/etc/kubernetes/admin.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/kubelet.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/controller-manager.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/scheduler.conf': No such file or directory
	I1202 19:17:51.332904   54807 kubeadm.go:158] found existing configuration files:
	
	I1202 19:17:51.332954   54807 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/admin.conf
	I1202 19:17:51.340776   54807 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/admin.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/admin.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/admin.conf: No such file or directory
	I1202 19:17:51.340830   54807 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/admin.conf
	I1202 19:17:51.348141   54807 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf
	I1202 19:17:51.355804   54807 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/kubelet.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/kubelet.conf: No such file or directory
	I1202 19:17:51.355867   54807 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/kubelet.conf
	I1202 19:17:51.363399   54807 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf
	I1202 19:17:51.371055   54807 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/controller-manager.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/controller-manager.conf: No such file or directory
	I1202 19:17:51.371110   54807 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/controller-manager.conf
	I1202 19:17:51.378528   54807 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf
	I1202 19:17:51.386558   54807 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/scheduler.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/scheduler.conf: No such file or directory
	I1202 19:17:51.386618   54807 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/scheduler.conf
	I1202 19:17:51.394349   54807 ssh_runner.go:286] Start: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables"
	I1202 19:17:51.435339   54807 kubeadm.go:319] [init] Using Kubernetes version: v1.35.0-beta.0
	I1202 19:17:51.435446   54807 kubeadm.go:319] [preflight] Running pre-flight checks
	I1202 19:17:51.512672   54807 kubeadm.go:319] [preflight] The system verification failed. Printing the output from the verification:
	I1202 19:17:51.512738   54807 kubeadm.go:319] KERNEL_VERSION: 5.15.0-1084-aws
	I1202 19:17:51.512772   54807 kubeadm.go:319] OS: Linux
	I1202 19:17:51.512816   54807 kubeadm.go:319] CGROUPS_CPU: enabled
	I1202 19:17:51.512863   54807 kubeadm.go:319] CGROUPS_CPUACCT: enabled
	I1202 19:17:51.512909   54807 kubeadm.go:319] CGROUPS_CPUSET: enabled
	I1202 19:17:51.512961   54807 kubeadm.go:319] CGROUPS_DEVICES: enabled
	I1202 19:17:51.513009   54807 kubeadm.go:319] CGROUPS_FREEZER: enabled
	I1202 19:17:51.513055   54807 kubeadm.go:319] CGROUPS_MEMORY: enabled
	I1202 19:17:51.513099   54807 kubeadm.go:319] CGROUPS_PIDS: enabled
	I1202 19:17:51.513146   54807 kubeadm.go:319] CGROUPS_HUGETLB: enabled
	I1202 19:17:51.513190   54807 kubeadm.go:319] CGROUPS_BLKIO: enabled
	I1202 19:17:51.580412   54807 kubeadm.go:319] [preflight] Pulling images required for setting up a Kubernetes cluster
	I1202 19:17:51.580517   54807 kubeadm.go:319] [preflight] This might take a minute or two, depending on the speed of your internet connection
	I1202 19:17:51.580607   54807 kubeadm.go:319] [preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	I1202 19:17:51.588752   54807 kubeadm.go:319] [certs] Using certificateDir folder "/var/lib/minikube/certs"
	I1202 19:17:51.594117   54807 out.go:252]   - Generating certificates and keys ...
	I1202 19:17:51.594201   54807 kubeadm.go:319] [certs] Using existing ca certificate authority
	I1202 19:17:51.594273   54807 kubeadm.go:319] [certs] Using existing apiserver certificate and key on disk
	I1202 19:17:51.594354   54807 kubeadm.go:319] [certs] Using existing apiserver-kubelet-client certificate and key on disk
	I1202 19:17:51.594424   54807 kubeadm.go:319] [certs] Using existing front-proxy-ca certificate authority
	I1202 19:17:51.594494   54807 kubeadm.go:319] [certs] Using existing front-proxy-client certificate and key on disk
	I1202 19:17:51.594547   54807 kubeadm.go:319] [certs] Using existing etcd/ca certificate authority
	I1202 19:17:51.594610   54807 kubeadm.go:319] [certs] Using existing etcd/server certificate and key on disk
	I1202 19:17:51.594671   54807 kubeadm.go:319] [certs] Using existing etcd/peer certificate and key on disk
	I1202 19:17:51.594744   54807 kubeadm.go:319] [certs] Using existing etcd/healthcheck-client certificate and key on disk
	I1202 19:17:51.594818   54807 kubeadm.go:319] [certs] Using existing apiserver-etcd-client certificate and key on disk
	I1202 19:17:51.594855   54807 kubeadm.go:319] [certs] Using the existing "sa" key
	I1202 19:17:51.594910   54807 kubeadm.go:319] [kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	I1202 19:17:51.705531   54807 kubeadm.go:319] [kubeconfig] Writing "admin.conf" kubeconfig file
	I1202 19:17:51.854203   54807 kubeadm.go:319] [kubeconfig] Writing "super-admin.conf" kubeconfig file
	I1202 19:17:52.029847   54807 kubeadm.go:319] [kubeconfig] Writing "kubelet.conf" kubeconfig file
	I1202 19:17:52.545269   54807 kubeadm.go:319] [kubeconfig] Writing "controller-manager.conf" kubeconfig file
	I1202 19:17:52.727822   54807 kubeadm.go:319] [kubeconfig] Writing "scheduler.conf" kubeconfig file
	I1202 19:17:52.728412   54807 kubeadm.go:319] [etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	I1202 19:17:52.730898   54807 kubeadm.go:319] [control-plane] Using manifest folder "/etc/kubernetes/manifests"
	I1202 19:17:52.734122   54807 out.go:252]   - Booting up control plane ...
	I1202 19:17:52.734222   54807 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-apiserver"
	I1202 19:17:52.734305   54807 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-controller-manager"
	I1202 19:17:52.734375   54807 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-scheduler"
	I1202 19:17:52.754118   54807 kubeadm.go:319] [kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	I1202 19:17:52.754386   54807 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	I1202 19:17:52.762146   54807 kubeadm.go:319] [patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	I1202 19:17:52.762405   54807 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	I1202 19:17:52.762460   54807 kubeadm.go:319] [kubelet-start] Starting the kubelet
	I1202 19:17:52.891581   54807 kubeadm.go:319] [wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	I1202 19:17:52.891694   54807 kubeadm.go:319] [kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	I1202 19:21:52.892779   54807 kubeadm.go:319] [kubelet-check] The kubelet is not healthy after 4m0.001197768s
	I1202 19:21:52.892808   54807 kubeadm.go:319] 
	I1202 19:21:52.892871   54807 kubeadm.go:319] Unfortunately, an error has occurred, likely caused by:
	I1202 19:21:52.892903   54807 kubeadm.go:319] 	- The kubelet is not running
	I1202 19:21:52.893025   54807 kubeadm.go:319] 	- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	I1202 19:21:52.893030   54807 kubeadm.go:319] 
	I1202 19:21:52.893133   54807 kubeadm.go:319] If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
	I1202 19:21:52.893170   54807 kubeadm.go:319] 	- 'systemctl status kubelet'
	I1202 19:21:52.893200   54807 kubeadm.go:319] 	- 'journalctl -xeu kubelet'
	I1202 19:21:52.893203   54807 kubeadm.go:319] 
	I1202 19:21:52.897451   54807 kubeadm.go:319] 	[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
	I1202 19:21:52.897878   54807 kubeadm.go:319] 	[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
	I1202 19:21:52.897986   54807 kubeadm.go:319] 	[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	I1202 19:21:52.898220   54807 kubeadm.go:319] error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	I1202 19:21:52.898225   54807 kubeadm.go:319] 
	I1202 19:21:52.898299   54807 kubeadm.go:319] To see the stack trace of this error execute with --v=5 or higher
	W1202 19:21:52.898412   54807 out.go:285] ! initialization failed, will try again: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-beta.0
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.001197768s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	
	To see the stack trace of this error execute with --v=5 or higher
	
	I1202 19:21:52.898501   54807 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm reset --cri-socket /run/containerd/containerd.sock --force"
	I1202 19:21:53.323346   54807 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I1202 19:21:53.337542   54807 kubeadm.go:215] ignoring SystemVerification for kubeadm because of docker driver
	I1202 19:21:53.337600   54807 ssh_runner.go:195] Run: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf
	I1202 19:21:53.345331   54807 kubeadm.go:156] config check failed, skipping stale config cleanup: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	ls: cannot access '/etc/kubernetes/admin.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/kubelet.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/controller-manager.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/scheduler.conf': No such file or directory
	I1202 19:21:53.345341   54807 kubeadm.go:158] found existing configuration files:
	
	I1202 19:21:53.345394   54807 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/admin.conf
	I1202 19:21:53.352948   54807 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/admin.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/admin.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/admin.conf: No such file or directory
	I1202 19:21:53.353002   54807 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/admin.conf
	I1202 19:21:53.360251   54807 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf
	I1202 19:21:53.367769   54807 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/kubelet.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/kubelet.conf: No such file or directory
	I1202 19:21:53.367833   54807 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/kubelet.conf
	I1202 19:21:53.375319   54807 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf
	I1202 19:21:53.383107   54807 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/controller-manager.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/controller-manager.conf: No such file or directory
	I1202 19:21:53.383164   54807 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/controller-manager.conf
	I1202 19:21:53.390823   54807 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf
	I1202 19:21:53.398923   54807 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/scheduler.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/scheduler.conf: No such file or directory
	I1202 19:21:53.398982   54807 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/scheduler.conf
	I1202 19:21:53.406858   54807 ssh_runner.go:286] Start: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables"
	I1202 19:21:53.455640   54807 kubeadm.go:319] [init] Using Kubernetes version: v1.35.0-beta.0
	I1202 19:21:53.455689   54807 kubeadm.go:319] [preflight] Running pre-flight checks
	I1202 19:21:53.530940   54807 kubeadm.go:319] [preflight] The system verification failed. Printing the output from the verification:
	I1202 19:21:53.531008   54807 kubeadm.go:319] KERNEL_VERSION: 5.15.0-1084-aws
	I1202 19:21:53.531042   54807 kubeadm.go:319] OS: Linux
	I1202 19:21:53.531086   54807 kubeadm.go:319] CGROUPS_CPU: enabled
	I1202 19:21:53.531133   54807 kubeadm.go:319] CGROUPS_CPUACCT: enabled
	I1202 19:21:53.531179   54807 kubeadm.go:319] CGROUPS_CPUSET: enabled
	I1202 19:21:53.531226   54807 kubeadm.go:319] CGROUPS_DEVICES: enabled
	I1202 19:21:53.531273   54807 kubeadm.go:319] CGROUPS_FREEZER: enabled
	I1202 19:21:53.531320   54807 kubeadm.go:319] CGROUPS_MEMORY: enabled
	I1202 19:21:53.531364   54807 kubeadm.go:319] CGROUPS_PIDS: enabled
	I1202 19:21:53.531410   54807 kubeadm.go:319] CGROUPS_HUGETLB: enabled
	I1202 19:21:53.531455   54807 kubeadm.go:319] CGROUPS_BLKIO: enabled
	I1202 19:21:53.605461   54807 kubeadm.go:319] [preflight] Pulling images required for setting up a Kubernetes cluster
	I1202 19:21:53.605584   54807 kubeadm.go:319] [preflight] This might take a minute or two, depending on the speed of your internet connection
	I1202 19:21:53.605706   54807 kubeadm.go:319] [preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	I1202 19:21:53.611090   54807 kubeadm.go:319] [certs] Using certificateDir folder "/var/lib/minikube/certs"
	I1202 19:21:53.616552   54807 out.go:252]   - Generating certificates and keys ...
	I1202 19:21:53.616667   54807 kubeadm.go:319] [certs] Using existing ca certificate authority
	I1202 19:21:53.616734   54807 kubeadm.go:319] [certs] Using existing apiserver certificate and key on disk
	I1202 19:21:53.616826   54807 kubeadm.go:319] [certs] Using existing apiserver-kubelet-client certificate and key on disk
	I1202 19:21:53.616887   54807 kubeadm.go:319] [certs] Using existing front-proxy-ca certificate authority
	I1202 19:21:53.616955   54807 kubeadm.go:319] [certs] Using existing front-proxy-client certificate and key on disk
	I1202 19:21:53.617008   54807 kubeadm.go:319] [certs] Using existing etcd/ca certificate authority
	I1202 19:21:53.617070   54807 kubeadm.go:319] [certs] Using existing etcd/server certificate and key on disk
	I1202 19:21:53.617132   54807 kubeadm.go:319] [certs] Using existing etcd/peer certificate and key on disk
	I1202 19:21:53.617207   54807 kubeadm.go:319] [certs] Using existing etcd/healthcheck-client certificate and key on disk
	I1202 19:21:53.617278   54807 kubeadm.go:319] [certs] Using existing apiserver-etcd-client certificate and key on disk
	I1202 19:21:53.617314   54807 kubeadm.go:319] [certs] Using the existing "sa" key
	I1202 19:21:53.617369   54807 kubeadm.go:319] [kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	I1202 19:21:53.704407   54807 kubeadm.go:319] [kubeconfig] Writing "admin.conf" kubeconfig file
	I1202 19:21:53.921613   54807 kubeadm.go:319] [kubeconfig] Writing "super-admin.conf" kubeconfig file
	I1202 19:21:54.521217   54807 kubeadm.go:319] [kubeconfig] Writing "kubelet.conf" kubeconfig file
	I1202 19:21:54.609103   54807 kubeadm.go:319] [kubeconfig] Writing "controller-manager.conf" kubeconfig file
	I1202 19:21:54.800380   54807 kubeadm.go:319] [kubeconfig] Writing "scheduler.conf" kubeconfig file
	I1202 19:21:54.800923   54807 kubeadm.go:319] [etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	I1202 19:21:54.803676   54807 kubeadm.go:319] [control-plane] Using manifest folder "/etc/kubernetes/manifests"
	I1202 19:21:54.806989   54807 out.go:252]   - Booting up control plane ...
	I1202 19:21:54.807091   54807 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-apiserver"
	I1202 19:21:54.807173   54807 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-controller-manager"
	I1202 19:21:54.807243   54807 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-scheduler"
	I1202 19:21:54.831648   54807 kubeadm.go:319] [kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	I1202 19:21:54.831750   54807 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	I1202 19:21:54.839547   54807 kubeadm.go:319] [patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	I1202 19:21:54.840014   54807 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	I1202 19:21:54.840081   54807 kubeadm.go:319] [kubelet-start] Starting the kubelet
	I1202 19:21:54.986075   54807 kubeadm.go:319] [wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	I1202 19:21:54.986189   54807 kubeadm.go:319] [kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	I1202 19:25:54.986676   54807 kubeadm.go:319] [kubelet-check] The kubelet is not healthy after 4m0.001082452s
	I1202 19:25:54.986700   54807 kubeadm.go:319] 
	I1202 19:25:54.986752   54807 kubeadm.go:319] Unfortunately, an error has occurred, likely caused by:
	I1202 19:25:54.986782   54807 kubeadm.go:319] 	- The kubelet is not running
	I1202 19:25:54.986880   54807 kubeadm.go:319] 	- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	I1202 19:25:54.986884   54807 kubeadm.go:319] 
	I1202 19:25:54.986982   54807 kubeadm.go:319] If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
	I1202 19:25:54.987011   54807 kubeadm.go:319] 	- 'systemctl status kubelet'
	I1202 19:25:54.987040   54807 kubeadm.go:319] 	- 'journalctl -xeu kubelet'
	I1202 19:25:54.987043   54807 kubeadm.go:319] 
	I1202 19:25:54.991498   54807 kubeadm.go:319] 	[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
	I1202 19:25:54.991923   54807 kubeadm.go:319] 	[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
	I1202 19:25:54.992031   54807 kubeadm.go:319] 	[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	I1202 19:25:54.992264   54807 kubeadm.go:319] error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	I1202 19:25:54.992269   54807 kubeadm.go:319] 
	I1202 19:25:54.992355   54807 kubeadm.go:319] To see the stack trace of this error execute with --v=5 or higher
	I1202 19:25:54.992407   54807 kubeadm.go:403] duration metric: took 12m8.130118214s to StartCluster
	I1202 19:25:54.992437   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:25:54.992498   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:25:55.018059   54807 cri.go:89] found id: ""
	I1202 19:25:55.018073   54807 logs.go:282] 0 containers: []
	W1202 19:25:55.018079   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:25:55.018085   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:25:55.018141   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:25:55.046728   54807 cri.go:89] found id: ""
	I1202 19:25:55.046741   54807 logs.go:282] 0 containers: []
	W1202 19:25:55.046749   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:25:55.046755   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:25:55.046820   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:25:55.073607   54807 cri.go:89] found id: ""
	I1202 19:25:55.073621   54807 logs.go:282] 0 containers: []
	W1202 19:25:55.073629   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:25:55.073638   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:25:55.073698   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:25:55.098149   54807 cri.go:89] found id: ""
	I1202 19:25:55.098163   54807 logs.go:282] 0 containers: []
	W1202 19:25:55.098170   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:25:55.098175   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:25:55.098231   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:25:55.126700   54807 cri.go:89] found id: ""
	I1202 19:25:55.126714   54807 logs.go:282] 0 containers: []
	W1202 19:25:55.126721   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:25:55.126727   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:25:55.126783   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:25:55.151684   54807 cri.go:89] found id: ""
	I1202 19:25:55.151697   54807 logs.go:282] 0 containers: []
	W1202 19:25:55.151704   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:25:55.151718   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:25:55.151776   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:25:55.179814   54807 cri.go:89] found id: ""
	I1202 19:25:55.179827   54807 logs.go:282] 0 containers: []
	W1202 19:25:55.179834   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:25:55.179842   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:25:55.179852   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:25:55.209677   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:25:55.209693   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:25:55.267260   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:25:55.267277   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:25:55.278280   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:25:55.278301   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:25:55.341995   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:25:55.334367   21526 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:25:55.334759   21526 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:25:55.336266   21526 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:25:55.336611   21526 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:25:55.338027   21526 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:25:55.334367   21526 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:25:55.334759   21526 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:25:55.336266   21526 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:25:55.336611   21526 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:25:55.338027   21526 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:25:55.342006   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:25:55.342016   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	W1202 19:25:55.404636   54807 out.go:434] Error starting cluster: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-beta.0
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.001082452s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	
	To see the stack trace of this error execute with --v=5 or higher
	W1202 19:25:55.404681   54807 out.go:285] * 
	W1202 19:25:55.404792   54807 out.go:285] X Error starting cluster: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-beta.0
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.001082452s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	
	To see the stack trace of this error execute with --v=5 or higher
	
	W1202 19:25:55.404837   54807 out.go:285] * 
	W1202 19:25:55.406981   54807 out.go:308] ╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	I1202 19:25:55.412566   54807 out.go:203] 
	W1202 19:25:55.416194   54807 out.go:285] X Exiting due to K8S_KUBELET_NOT_RUNNING: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-beta.0
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.001082452s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	
	To see the stack trace of this error execute with --v=5 or higher
	
	W1202 19:25:55.416239   54807 out.go:285] * Suggestion: Check output of 'journalctl -xeu kubelet', try passing --extra-config=kubelet.cgroup-driver=systemd to minikube start
	W1202 19:25:55.416259   54807 out.go:285] * Related issue: https://github.com/kubernetes/minikube/issues/4172
	I1202 19:25:55.420152   54807 out.go:203] 
	
	
	==> container status <==
	CONTAINER           IMAGE               CREATED             STATE               NAME                ATTEMPT             POD ID              POD                 NAMESPACE
	
	
	==> containerd <==
	Dec 02 19:13:45 functional-449836 containerd[10263]: time="2025-12-02T19:13:45.578599853Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1
	Dec 02 19:13:45 functional-449836 containerd[10263]: time="2025-12-02T19:13:45.578662622Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1
	Dec 02 19:13:45 functional-449836 containerd[10263]: time="2025-12-02T19:13:45.578725071Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1
	Dec 02 19:13:45 functional-449836 containerd[10263]: time="2025-12-02T19:13:45.578783803Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1
	Dec 02 19:13:45 functional-449836 containerd[10263]: time="2025-12-02T19:13:45.578855024Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1
	Dec 02 19:13:45 functional-449836 containerd[10263]: time="2025-12-02T19:13:45.578924119Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1
	Dec 02 19:13:45 functional-449836 containerd[10263]: time="2025-12-02T19:13:45.578991820Z" level=info msg="runtime interface created"
	Dec 02 19:13:45 functional-449836 containerd[10263]: time="2025-12-02T19:13:45.579043832Z" level=info msg="created NRI interface"
	Dec 02 19:13:45 functional-449836 containerd[10263]: time="2025-12-02T19:13:45.579105847Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1
	Dec 02 19:13:45 functional-449836 containerd[10263]: time="2025-12-02T19:13:45.579207451Z" level=info msg="Connect containerd service"
	Dec 02 19:13:45 functional-449836 containerd[10263]: time="2025-12-02T19:13:45.579595759Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this"
	Dec 02 19:13:45 functional-449836 containerd[10263]: time="2025-12-02T19:13:45.580416453Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config"
	Dec 02 19:13:45 functional-449836 containerd[10263]: time="2025-12-02T19:13:45.590441353Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc
	Dec 02 19:13:45 functional-449836 containerd[10263]: time="2025-12-02T19:13:45.590507150Z" level=info msg=serving... address=/run/containerd/containerd.sock
	Dec 02 19:13:45 functional-449836 containerd[10263]: time="2025-12-02T19:13:45.590537673Z" level=info msg="Start subscribing containerd event"
	Dec 02 19:13:45 functional-449836 containerd[10263]: time="2025-12-02T19:13:45.590591277Z" level=info msg="Start recovering state"
	Dec 02 19:13:45 functional-449836 containerd[10263]: time="2025-12-02T19:13:45.614386130Z" level=info msg="Start event monitor"
	Dec 02 19:13:45 functional-449836 containerd[10263]: time="2025-12-02T19:13:45.614577326Z" level=info msg="Start cni network conf syncer for default"
	Dec 02 19:13:45 functional-449836 containerd[10263]: time="2025-12-02T19:13:45.614677601Z" level=info msg="Start streaming server"
	Dec 02 19:13:45 functional-449836 containerd[10263]: time="2025-12-02T19:13:45.614762451Z" level=info msg="Registered namespace \"k8s.io\" with NRI"
	Dec 02 19:13:45 functional-449836 containerd[10263]: time="2025-12-02T19:13:45.614968071Z" level=info msg="runtime interface starting up..."
	Dec 02 19:13:45 functional-449836 containerd[10263]: time="2025-12-02T19:13:45.615037774Z" level=info msg="starting plugins..."
	Dec 02 19:13:45 functional-449836 containerd[10263]: time="2025-12-02T19:13:45.615100272Z" level=info msg="Synchronizing NRI (plugin) with current runtime state"
	Dec 02 19:13:45 functional-449836 containerd[10263]: time="2025-12-02T19:13:45.615329048Z" level=info msg="containerd successfully booted in 0.058232s"
	Dec 02 19:13:45 functional-449836 systemd[1]: Started containerd.service - containerd container runtime.
	
	
	==> describe nodes <==
	command /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" failed with error: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:25:56.646151   21638 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:25:56.646789   21638 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:25:56.648439   21638 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:25:56.649138   21638 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:25:56.650690   21638 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	
	==> dmesg <==
	[Dec 2 18:17] ACPI: SRAT not present
	[  +0.000000] ACPI: SRAT not present
	[  +0.000000] SPI driver altr_a10sr has no spi_device_id for altr,a10sr
	[  +0.015127] device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
	[  +0.494583] systemd[1]: Configuration file /run/systemd/system/netplan-ovs-cleanup.service is marked world-inaccessible. This has no effect as configuration data is accessible via APIs without restrictions. Proceeding anyway.
	[  +0.035754] systemd[1]: /lib/systemd/system/snapd.service:23: Unknown key name 'RestartMode' in section 'Service', ignoring.
	[  +0.870945] ena 0000:00:05.0: LLQ is not supported Fallback to host mode policy.
	[  +6.299680] kauditd_printk_skb: 36 callbacks suppressed
	
	
	==> kernel <==
	 19:25:56 up  1:08,  0 user,  load average: 0.03, 0.16, 0.33
	Linux functional-449836 5.15.0-1084-aws #91~20.04.1-Ubuntu SMP Fri May 2 07:00:04 UTC 2025 aarch64 GNU/Linux
	PRETTY_NAME="Debian GNU/Linux 12 (bookworm)"
	
	
	==> kubelet <==
	Dec 02 19:25:53 functional-449836 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 02 19:25:54 functional-449836 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 319.
	Dec 02 19:25:54 functional-449836 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 02 19:25:54 functional-449836 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 02 19:25:54 functional-449836 kubelet[21441]: E1202 19:25:54.216694   21441 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 02 19:25:54 functional-449836 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 02 19:25:54 functional-449836 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 02 19:25:54 functional-449836 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 320.
	Dec 02 19:25:54 functional-449836 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 02 19:25:54 functional-449836 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 02 19:25:54 functional-449836 kubelet[21447]: E1202 19:25:54.984542   21447 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 02 19:25:54 functional-449836 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 02 19:25:54 functional-449836 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 02 19:25:55 functional-449836 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 321.
	Dec 02 19:25:55 functional-449836 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 02 19:25:55 functional-449836 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 02 19:25:55 functional-449836 kubelet[21545]: E1202 19:25:55.733113   21545 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 02 19:25:55 functional-449836 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 02 19:25:55 functional-449836 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 02 19:25:56 functional-449836 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 322.
	Dec 02 19:25:56 functional-449836 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 02 19:25:56 functional-449836 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 02 19:25:56 functional-449836 kubelet[21593]: E1202 19:25:56.480645   21593 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 02 19:25:56 functional-449836 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 02 19:25:56 functional-449836 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	

-- /stdout --
helpers_test.go:262: (dbg) Run:  out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-449836 -n functional-449836
helpers_test.go:262: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-449836 -n functional-449836: exit status 2 (327.568959ms)

-- stdout --
	Stopped

-- /stdout --
helpers_test.go:262: status error: exit status 2 (may be ok)
helpers_test.go:264: "functional-449836" apiserver is not running, skipping kubectl commands (state="Stopped")
--- FAIL: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/ExtraConfig (734.82s)
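The kubelet log above shows the same validation error on every restart: kubelet v1.35.0-beta.0 refuses to start on a host still using cgroup v1. Per the `[WARNING SystemVerification]` message earlier in this log, the opt-out is the kubelet configuration option `FailCgroupV1`. As a sketch only (how this override is threaded through minikube/kubeadm in this CI setup is not confirmed by the log), the corresponding KubeletConfiguration fragment would be:

```yaml
# Sketch of the kubelet configuration override referenced by the
# [WARNING SystemVerification] message in the kubeadm output above.
# On a cgroup v1 host, kubelet v1.35+ exits during config validation
# unless this field is explicitly set to false:
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
failCgroupV1: false
```

Whether a host is on cgroup v1 or v2 can be checked with `stat -fc %T /sys/fs/cgroup/` (`cgroup2fs` indicates v2, `tmpfs` indicates v1); the 5.15.0-1084-aws kernel in this run is on cgroup v1 according to the kubelet error, which matches the restart counter climbing past 320.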

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/ComponentHealth (2.27s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/ComponentHealth
functional_test.go:825: (dbg) Run:  kubectl --context functional-449836 get po -l tier=control-plane -n kube-system -o=json
functional_test.go:825: (dbg) Non-zero exit: kubectl --context functional-449836 get po -l tier=control-plane -n kube-system -o=json: exit status 1 (65.133479ms)

-- stdout --
	{
	    "apiVersion": "v1",
	    "items": [],
	    "kind": "List",
	    "metadata": {
	        "resourceVersion": ""
	    }
	}

-- /stdout --
** stderr ** 
	The connection to the server 192.168.49.2:8441 was refused - did you specify the right host or port?

** /stderr **
functional_test.go:827: failed to get components. args "kubectl --context functional-449836 get po -l tier=control-plane -n kube-system -o=json": exit status 1
helpers_test.go:222: -----------------------post-mortem--------------------------------
helpers_test.go:223: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/ComponentHealth]: network settings <======
helpers_test.go:230: HOST ENV snapshots: PROXY env: HTTP_PROXY="<empty>" HTTPS_PROXY="<empty>" NO_PROXY="<empty>"
helpers_test.go:238: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/ComponentHealth]: docker inspect <======
helpers_test.go:239: (dbg) Run:  docker inspect functional-449836
helpers_test.go:243: (dbg) docker inspect functional-449836:

-- stdout --
	[
	    {
	        "Id": "6870f21b62bb6903aca3129f1ce4723cf3f2ffad99a50b164f8e2dc04b50e75d",
	        "Created": "2025-12-02T18:58:53.515075222Z",
	        "Path": "/usr/local/bin/entrypoint",
	        "Args": [
	            "/sbin/init"
	        ],
	        "State": {
	            "Status": "running",
	            "Running": true,
	            "Paused": false,
	            "Restarting": false,
	            "OOMKilled": false,
	            "Dead": false,
	            "Pid": 43093,
	            "ExitCode": 0,
	            "Error": "",
	            "StartedAt": "2025-12-02T18:58:53.587847975Z",
	            "FinishedAt": "0001-01-01T00:00:00Z"
	        },
	        "Image": "sha256:ac919894123858c63a6b115b7a0677e38aafc32ba4f00c3ebbd7c61e958451be",
	        "ResolvConfPath": "/var/lib/docker/containers/6870f21b62bb6903aca3129f1ce4723cf3f2ffad99a50b164f8e2dc04b50e75d/resolv.conf",
	        "HostnamePath": "/var/lib/docker/containers/6870f21b62bb6903aca3129f1ce4723cf3f2ffad99a50b164f8e2dc04b50e75d/hostname",
	        "HostsPath": "/var/lib/docker/containers/6870f21b62bb6903aca3129f1ce4723cf3f2ffad99a50b164f8e2dc04b50e75d/hosts",
	        "LogPath": "/var/lib/docker/containers/6870f21b62bb6903aca3129f1ce4723cf3f2ffad99a50b164f8e2dc04b50e75d/6870f21b62bb6903aca3129f1ce4723cf3f2ffad99a50b164f8e2dc04b50e75d-json.log",
	        "Name": "/functional-449836",
	        "RestartCount": 0,
	        "Driver": "overlay2",
	        "Platform": "linux",
	        "MountLabel": "",
	        "ProcessLabel": "",
	        "AppArmorProfile": "unconfined",
	        "ExecIDs": null,
	        "HostConfig": {
	            "Binds": [
	                "/lib/modules:/lib/modules:ro",
	                "functional-449836:/var"
	            ],
	            "ContainerIDFile": "",
	            "LogConfig": {
	                "Type": "json-file",
	                "Config": {}
	            },
	            "NetworkMode": "functional-449836",
	            "PortBindings": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "8441/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ]
	            },
	            "RestartPolicy": {
	                "Name": "no",
	                "MaximumRetryCount": 0
	            },
	            "AutoRemove": false,
	            "VolumeDriver": "",
	            "VolumesFrom": null,
	            "ConsoleSize": [
	                0,
	                0
	            ],
	            "CapAdd": null,
	            "CapDrop": null,
	            "CgroupnsMode": "host",
	            "Dns": [],
	            "DnsOptions": [],
	            "DnsSearch": [],
	            "ExtraHosts": null,
	            "GroupAdd": null,
	            "IpcMode": "private",
	            "Cgroup": "",
	            "Links": null,
	            "OomScoreAdj": 0,
	            "PidMode": "",
	            "Privileged": true,
	            "PublishAllPorts": false,
	            "ReadonlyRootfs": false,
	            "SecurityOpt": [
	                "seccomp=unconfined",
	                "apparmor=unconfined",
	                "label=disable"
	            ],
	            "Tmpfs": {
	                "/run": "",
	                "/tmp": ""
	            },
	            "UTSMode": "",
	            "UsernsMode": "",
	            "ShmSize": 67108864,
	            "Runtime": "runc",
	            "Isolation": "",
	            "CpuShares": 0,
	            "Memory": 4294967296,
	            "NanoCpus": 2000000000,
	            "CgroupParent": "",
	            "BlkioWeight": 0,
	            "BlkioWeightDevice": [],
	            "BlkioDeviceReadBps": [],
	            "BlkioDeviceWriteBps": [],
	            "BlkioDeviceReadIOps": [],
	            "BlkioDeviceWriteIOps": [],
	            "CpuPeriod": 0,
	            "CpuQuota": 0,
	            "CpuRealtimePeriod": 0,
	            "CpuRealtimeRuntime": 0,
	            "CpusetCpus": "",
	            "CpusetMems": "",
	            "Devices": [],
	            "DeviceCgroupRules": null,
	            "DeviceRequests": null,
	            "MemoryReservation": 0,
	            "MemorySwap": 8589934592,
	            "MemorySwappiness": null,
	            "OomKillDisable": false,
	            "PidsLimit": null,
	            "Ulimits": [],
	            "CpuCount": 0,
	            "CpuPercent": 0,
	            "IOMaximumIOps": 0,
	            "IOMaximumBandwidth": 0,
	            "MaskedPaths": null,
	            "ReadonlyPaths": null
	        },
	        "GraphDriver": {
	            "Data": {
	                "ID": "6870f21b62bb6903aca3129f1ce4723cf3f2ffad99a50b164f8e2dc04b50e75d",
	                "LowerDir": "/var/lib/docker/overlay2/41ea5a18fbf8709879a7fc4066a6a4a1474aa86e898ee1ccabe5669a1871131d-init/diff:/var/lib/docker/overlay2/a59c61675ee48e07a7f4a8725bd393449453344ad8907963779ea1c0059d936c/diff",
	                "MergedDir": "/var/lib/docker/overlay2/41ea5a18fbf8709879a7fc4066a6a4a1474aa86e898ee1ccabe5669a1871131d/merged",
	                "UpperDir": "/var/lib/docker/overlay2/41ea5a18fbf8709879a7fc4066a6a4a1474aa86e898ee1ccabe5669a1871131d/diff",
	                "WorkDir": "/var/lib/docker/overlay2/41ea5a18fbf8709879a7fc4066a6a4a1474aa86e898ee1ccabe5669a1871131d/work"
	            },
	            "Name": "overlay2"
	        },
	        "Mounts": [
	            {
	                "Type": "volume",
	                "Name": "functional-449836",
	                "Source": "/var/lib/docker/volumes/functional-449836/_data",
	                "Destination": "/var",
	                "Driver": "local",
	                "Mode": "z",
	                "RW": true,
	                "Propagation": ""
	            },
	            {
	                "Type": "bind",
	                "Source": "/lib/modules",
	                "Destination": "/lib/modules",
	                "Mode": "ro",
	                "RW": false,
	                "Propagation": "rprivate"
	            }
	        ],
	        "Config": {
	            "Hostname": "functional-449836",
	            "Domainname": "",
	            "User": "",
	            "AttachStdin": false,
	            "AttachStdout": false,
	            "AttachStderr": false,
	            "ExposedPorts": {
	                "22/tcp": {},
	                "2376/tcp": {},
	                "32443/tcp": {},
	                "5000/tcp": {},
	                "8441/tcp": {}
	            },
	            "Tty": true,
	            "OpenStdin": false,
	            "StdinOnce": false,
	            "Env": [
	                "container=docker",
	                "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin"
	            ],
	            "Cmd": null,
	            "Image": "gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b",
	            "Volumes": null,
	            "WorkingDir": "/",
	            "Entrypoint": [
	                "/usr/local/bin/entrypoint",
	                "/sbin/init"
	            ],
	            "OnBuild": null,
	            "Labels": {
	                "created_by.minikube.sigs.k8s.io": "true",
	                "mode.minikube.sigs.k8s.io": "functional-449836",
	                "name.minikube.sigs.k8s.io": "functional-449836",
	                "role.minikube.sigs.k8s.io": ""
	            },
	            "StopSignal": "SIGRTMIN+3"
	        },
	        "NetworkSettings": {
	            "Bridge": "",
	            "SandboxID": "410fbaab809a56b556195f7ed6eeff8dcd31e9020fb1dbfacf74828b79df3d88",
	            "SandboxKey": "/var/run/docker/netns/410fbaab809a",
	            "Ports": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "32788"
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "32789"
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "32792"
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "32790"
	                    }
	                ],
	                "8441/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "32791"
	                    }
	                ]
	            },
	            "HairpinMode": false,
	            "LinkLocalIPv6Address": "",
	            "LinkLocalIPv6PrefixLen": 0,
	            "SecondaryIPAddresses": null,
	            "SecondaryIPv6Addresses": null,
	            "EndpointID": "",
	            "Gateway": "",
	            "GlobalIPv6Address": "",
	            "GlobalIPv6PrefixLen": 0,
	            "IPAddress": "",
	            "IPPrefixLen": 0,
	            "IPv6Gateway": "",
	            "MacAddress": "",
	            "Networks": {
	                "functional-449836": {
	                    "IPAMConfig": {
	                        "IPv4Address": "192.168.49.2"
	                    },
	                    "Links": null,
	                    "Aliases": null,
	                    "MacAddress": "82:18:11:ce:46:48",
	                    "DriverOpts": null,
	                    "GwPriority": 0,
	                    "NetworkID": "3cb7d67d4fa267ddd6b37211325b224fb3fb811be8ff57bda18e19f6929ec9c8",
	                    "EndpointID": "20c8c1a67e53d7615656777f73986a40cb1c6affb22c4db185c479ac85cbdb14",
	                    "Gateway": "192.168.49.1",
	                    "IPAddress": "192.168.49.2",
	                    "IPPrefixLen": 24,
	                    "IPv6Gateway": "",
	                    "GlobalIPv6Address": "",
	                    "GlobalIPv6PrefixLen": 0,
	                    "DNSNames": [
	                        "functional-449836",
	                        "6870f21b62bb"
	                    ]
	                }
	            }
	        }
	    }
	]

-- /stdout --
helpers_test.go:247: (dbg) Run:  out/minikube-linux-arm64 status --format={{.Host}} -p functional-449836 -n functional-449836
helpers_test.go:247: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.Host}} -p functional-449836 -n functional-449836: exit status 2 (301.554248ms)

-- stdout --
	Running

-- /stdout --
helpers_test.go:247: status error: exit status 2 (may be ok)
helpers_test.go:252: <<< TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/ComponentHealth FAILED: start of post-mortem logs <<<
helpers_test.go:253: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/ComponentHealth]: minikube logs <======
helpers_test.go:255: (dbg) Run:  out/minikube-linux-arm64 -p functional-449836 logs -n 25
helpers_test.go:260: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/ComponentHealth logs: 
-- stdout --
	
	==> Audit <==
	┌─────────┬─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┬───────────────────┬─────────┬─────────┬─────────────────────┬─────────────────────┐
	│ COMMAND │                                                                          ARGS                                                                           │      PROFILE      │  USER   │ VERSION │     START TIME      │      END TIME       │
	├─────────┼─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┼───────────────────┼─────────┼─────────┼─────────────────────┼─────────────────────┤
	│ image   │ functional-224594 image ls --format yaml --alsologtostderr                                                                                              │ functional-224594 │ jenkins │ v1.37.0 │ 02 Dec 25 18:58 UTC │ 02 Dec 25 18:58 UTC │
	│ ssh     │ functional-224594 ssh pgrep buildkitd                                                                                                                   │ functional-224594 │ jenkins │ v1.37.0 │ 02 Dec 25 18:58 UTC │                     │
	│ image   │ functional-224594 image ls --format json --alsologtostderr                                                                                              │ functional-224594 │ jenkins │ v1.37.0 │ 02 Dec 25 18:58 UTC │ 02 Dec 25 18:58 UTC │
	│ image   │ functional-224594 image ls --format table --alsologtostderr                                                                                             │ functional-224594 │ jenkins │ v1.37.0 │ 02 Dec 25 18:58 UTC │ 02 Dec 25 18:58 UTC │
	│ image   │ functional-224594 image build -t localhost/my-image:functional-224594 testdata/build --alsologtostderr                                                  │ functional-224594 │ jenkins │ v1.37.0 │ 02 Dec 25 18:58 UTC │ 02 Dec 25 18:58 UTC │
	│ image   │ functional-224594 image ls                                                                                                                              │ functional-224594 │ jenkins │ v1.37.0 │ 02 Dec 25 18:58 UTC │ 02 Dec 25 18:58 UTC │
	│ delete  │ -p functional-224594                                                                                                                                    │ functional-224594 │ jenkins │ v1.37.0 │ 02 Dec 25 18:58 UTC │ 02 Dec 25 18:58 UTC │
	│ start   │ -p functional-449836 --memory=4096 --apiserver-port=8441 --wait=all --driver=docker  --container-runtime=containerd --kubernetes-version=v1.35.0-beta.0 │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 18:58 UTC │                     │
	│ start   │ -p functional-449836 --alsologtostderr -v=8                                                                                                             │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:07 UTC │                     │
	│ cache   │ functional-449836 cache add registry.k8s.io/pause:3.1                                                                                                   │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:13 UTC │ 02 Dec 25 19:13 UTC │
	│ cache   │ functional-449836 cache add registry.k8s.io/pause:3.3                                                                                                   │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:13 UTC │ 02 Dec 25 19:13 UTC │
	│ cache   │ functional-449836 cache add registry.k8s.io/pause:latest                                                                                                │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:13 UTC │ 02 Dec 25 19:13 UTC │
	│ cache   │ functional-449836 cache add minikube-local-cache-test:functional-449836                                                                                 │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:13 UTC │ 02 Dec 25 19:13 UTC │
	│ cache   │ functional-449836 cache delete minikube-local-cache-test:functional-449836                                                                              │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:13 UTC │ 02 Dec 25 19:13 UTC │
	│ cache   │ delete registry.k8s.io/pause:3.3                                                                                                                        │ minikube          │ jenkins │ v1.37.0 │ 02 Dec 25 19:13 UTC │ 02 Dec 25 19:13 UTC │
	│ cache   │ list                                                                                                                                                    │ minikube          │ jenkins │ v1.37.0 │ 02 Dec 25 19:13 UTC │ 02 Dec 25 19:13 UTC │
	│ ssh     │ functional-449836 ssh sudo crictl images                                                                                                                │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:13 UTC │ 02 Dec 25 19:13 UTC │
	│ ssh     │ functional-449836 ssh sudo crictl rmi registry.k8s.io/pause:latest                                                                                      │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:13 UTC │ 02 Dec 25 19:13 UTC │
	│ ssh     │ functional-449836 ssh sudo crictl inspecti registry.k8s.io/pause:latest                                                                                 │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:13 UTC │                     │
	│ cache   │ functional-449836 cache reload                                                                                                                          │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:13 UTC │ 02 Dec 25 19:13 UTC │
	│ ssh     │ functional-449836 ssh sudo crictl inspecti registry.k8s.io/pause:latest                                                                                 │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:13 UTC │ 02 Dec 25 19:13 UTC │
	│ cache   │ delete registry.k8s.io/pause:3.1                                                                                                                        │ minikube          │ jenkins │ v1.37.0 │ 02 Dec 25 19:13 UTC │ 02 Dec 25 19:13 UTC │
	│ cache   │ delete registry.k8s.io/pause:latest                                                                                                                     │ minikube          │ jenkins │ v1.37.0 │ 02 Dec 25 19:13 UTC │ 02 Dec 25 19:13 UTC │
	│ kubectl │ functional-449836 kubectl -- --context functional-449836 get pods                                                                                       │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:13 UTC │                     │
	│ start   │ -p functional-449836 --extra-config=apiserver.enable-admission-plugins=NamespaceAutoProvision --wait=all                                                │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:13 UTC │                     │
	└─────────┴─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┴───────────────────┴─────────┴─────────┴─────────────────────┴─────────────────────┘
	
	
	==> Last Start <==
	Log file created at: 2025/12/02 19:13:42
	Running on machine: ip-172-31-31-251
	Binary: Built with gc go1.25.3 for linux/arm64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I1202 19:13:42.762704   54807 out.go:360] Setting OutFile to fd 1 ...
	I1202 19:13:42.762827   54807 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1202 19:13:42.762831   54807 out.go:374] Setting ErrFile to fd 2...
	I1202 19:13:42.762834   54807 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1202 19:13:42.763078   54807 root.go:338] Updating PATH: /home/jenkins/minikube-integration/22021-2487/.minikube/bin
	I1202 19:13:42.763410   54807 out.go:368] Setting JSON to false
	I1202 19:13:42.764228   54807 start.go:133] hostinfo: {"hostname":"ip-172-31-31-251","uptime":3359,"bootTime":1764699464,"procs":155,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1084-aws","kernelArch":"aarch64","virtualizationSystem":"","virtualizationRole":"","hostId":"982e3628-3742-4b3e-bb63-ac1b07660ec7"}
	I1202 19:13:42.764287   54807 start.go:143] virtualization:  
	I1202 19:13:42.767748   54807 out.go:179] * [functional-449836] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	I1202 19:13:42.771595   54807 out.go:179]   - MINIKUBE_LOCATION=22021
	I1202 19:13:42.771638   54807 notify.go:221] Checking for updates...
	I1202 19:13:42.777727   54807 out.go:179]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I1202 19:13:42.780738   54807 out.go:179]   - KUBECONFIG=/home/jenkins/minikube-integration/22021-2487/kubeconfig
	I1202 19:13:42.783655   54807 out.go:179]   - MINIKUBE_HOME=/home/jenkins/minikube-integration/22021-2487/.minikube
	I1202 19:13:42.786554   54807 out.go:179]   - MINIKUBE_BIN=out/minikube-linux-arm64
	I1202 19:13:42.789556   54807 out.go:179]   - MINIKUBE_FORCE_SYSTEMD=
	I1202 19:13:42.793178   54807 config.go:182] Loaded profile config "functional-449836": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-beta.0
	I1202 19:13:42.793273   54807 driver.go:422] Setting default libvirt URI to qemu:///system
	I1202 19:13:42.817932   54807 docker.go:124] docker version: linux-28.1.1:Docker Engine - Community
	I1202 19:13:42.818037   54807 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1202 19:13:42.893670   54807 info.go:266] docker info: {ID:EOU5:DNGX:XN6V:L2FZ:UXRM:5TWK:EVUR:KC2F:GT7Z:Y4O4:GB77:5PD3 Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:55 SystemTime:2025-12-02 19:13:42.884370868 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214839296 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-31-251 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1202 19:13:42.893764   54807 docker.go:319] overlay module found
	I1202 19:13:42.896766   54807 out.go:179] * Using the docker driver based on existing profile
	I1202 19:13:42.899559   54807 start.go:309] selected driver: docker
	I1202 19:13:42.899567   54807 start.go:927] validating driver "docker" against &{Name:functional-449836 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-449836 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1202 19:13:42.899671   54807 start.go:938] status for docker: {Installed:true Healthy:true Running:false NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I1202 19:13:42.899770   54807 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1202 19:13:42.952802   54807 info.go:266] docker info: {ID:EOU5:DNGX:XN6V:L2FZ:UXRM:5TWK:EVUR:KC2F:GT7Z:Y4O4:GB77:5PD3 Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:55 SystemTime:2025-12-02 19:13:42.943962699 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214839296 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-31-251 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1202 19:13:42.953225   54807 start_flags.go:992] Waiting for all components: map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true]
	I1202 19:13:42.953247   54807 cni.go:84] Creating CNI manager for ""
	I1202 19:13:42.953303   54807 cni.go:143] "docker" driver + "containerd" runtime found, recommending kindnet
	I1202 19:13:42.953342   54807 start.go:353] cluster config:
	{Name:functional-449836 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-449836 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1202 19:13:42.958183   54807 out.go:179] * Starting "functional-449836" primary control-plane node in "functional-449836" cluster
	I1202 19:13:42.960983   54807 cache.go:134] Beginning downloading kic base image for docker with containerd
	I1202 19:13:42.963884   54807 out.go:179] * Pulling base image v0.0.48-1764169655-21974 ...
	I1202 19:13:42.968058   54807 preload.go:188] Checking if preload exists for k8s version v1.35.0-beta.0 and runtime containerd
	I1202 19:13:42.968252   54807 image.go:81] Checking for gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b in local docker daemon
	I1202 19:13:42.989666   54807 image.go:100] Found gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b in local docker daemon, skipping pull
	I1202 19:13:42.989677   54807 cache.go:158] gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b exists in daemon, skipping load
	W1202 19:13:43.031045   54807 preload.go:144] https://storage.googleapis.com/minikube-preloaded-volume-tarballs/v18/v1.35.0-beta.0/preloaded-images-k8s-v18-v1.35.0-beta.0-containerd-overlay2-arm64.tar.lz4 status code: 404
	W1202 19:13:43.240107   54807 preload.go:144] https://github.com/kubernetes-sigs/minikube-preloads/releases/download/v18/preloaded-images-k8s-v18-v1.35.0-beta.0-containerd-overlay2-arm64.tar.lz4 status code: 404
	I1202 19:13:43.240267   54807 profile.go:143] Saving config to /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/config.json ...
	I1202 19:13:43.240445   54807 cache.go:107] acquiring lock: {Name:mk7e3720bc30e96a70479f1acc707ef52791d566 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1202 19:13:43.240540   54807 cache.go:115] /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/kube-controller-manager_v1.35.0-beta.0 exists
	I1202 19:13:43.240557   54807 cache.go:96] cache image "registry.k8s.io/kube-controller-manager:v1.35.0-beta.0" -> "/home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/kube-controller-manager_v1.35.0-beta.0" took 118.031µs
	I1202 19:13:43.240570   54807 cache.go:80] save to tar file registry.k8s.io/kube-controller-manager:v1.35.0-beta.0 -> /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/kube-controller-manager_v1.35.0-beta.0 succeeded
	I1202 19:13:43.240584   54807 cache.go:107] acquiring lock: {Name:mkfa39bba55c97fa80e441f8dcbaf6dc6a2ab6fc Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1202 19:13:43.240616   54807 cache.go:115] /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/kube-apiserver_v1.35.0-beta.0 exists
	I1202 19:13:43.240621   54807 cache.go:96] cache image "registry.k8s.io/kube-apiserver:v1.35.0-beta.0" -> "/home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/kube-apiserver_v1.35.0-beta.0" took 38.835µs
	I1202 19:13:43.240626   54807 cache.go:80] save to tar file registry.k8s.io/kube-apiserver:v1.35.0-beta.0 -> /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/kube-apiserver_v1.35.0-beta.0 succeeded
	I1202 19:13:43.240809   54807 cache.go:243] Successfully downloaded all kic artifacts
	I1202 19:13:43.240835   54807 start.go:360] acquireMachinesLock for functional-449836: {Name:mk8999fdfa518fc15358d07431fe9bec286a035e Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1202 19:13:43.240864   54807 start.go:364] duration metric: took 20.397µs to acquireMachinesLock for "functional-449836"
	I1202 19:13:43.240875   54807 start.go:96] Skipping create...Using existing machine configuration
	I1202 19:13:43.240879   54807 fix.go:54] fixHost starting: 
	I1202 19:13:43.241152   54807 cli_runner.go:164] Run: docker container inspect functional-449836 --format={{.State.Status}}
	I1202 19:13:43.241336   54807 cache.go:107] acquiring lock: {Name:mkb3ffc95e4b7ac3756206049d851bf516a8abb7 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1202 19:13:43.241393   54807 cache.go:115] /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/gcr.io/k8s-minikube/storage-provisioner_v5 exists
	I1202 19:13:43.241400   54807 cache.go:96] cache image "gcr.io/k8s-minikube/storage-provisioner:v5" -> "/home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/gcr.io/k8s-minikube/storage-provisioner_v5" took 69.973µs
	I1202 19:13:43.241406   54807 cache.go:80] save to tar file gcr.io/k8s-minikube/storage-provisioner:v5 -> /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/gcr.io/k8s-minikube/storage-provisioner_v5 succeeded
	I1202 19:13:43.241456   54807 cache.go:107] acquiring lock: {Name:mk87fcb81abcb9216a37cb770c1db1797c0a7f91 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1202 19:13:43.241496   54807 cache.go:115] /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/kube-scheduler_v1.35.0-beta.0 exists
	I1202 19:13:43.241501   54807 cache.go:96] cache image "registry.k8s.io/kube-scheduler:v1.35.0-beta.0" -> "/home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/kube-scheduler_v1.35.0-beta.0" took 46.589µs
	I1202 19:13:43.241506   54807 cache.go:80] save to tar file registry.k8s.io/kube-scheduler:v1.35.0-beta.0 -> /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/kube-scheduler_v1.35.0-beta.0 succeeded
	I1202 19:13:43.241515   54807 cache.go:107] acquiring lock: {Name:mkb0da8840651a370490ea2b46213e13fc0d5dac Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1202 19:13:43.241539   54807 cache.go:115] /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/kube-proxy_v1.35.0-beta.0 exists
	I1202 19:13:43.241543   54807 cache.go:96] cache image "registry.k8s.io/kube-proxy:v1.35.0-beta.0" -> "/home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/kube-proxy_v1.35.0-beta.0" took 29.662µs
	I1202 19:13:43.241548   54807 cache.go:80] save to tar file registry.k8s.io/kube-proxy:v1.35.0-beta.0 -> /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/kube-proxy_v1.35.0-beta.0 succeeded
	I1202 19:13:43.241556   54807 cache.go:107] acquiring lock: {Name:mk9eec99a3e8e54b076a2ce506d08ceb8a7f49cb Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1202 19:13:43.241581   54807 cache.go:115] /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/pause_3.10.1 exists
	I1202 19:13:43.241585   54807 cache.go:96] cache image "registry.k8s.io/pause:3.10.1" -> "/home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/pause_3.10.1" took 29.85µs
	I1202 19:13:43.241589   54807 cache.go:80] save to tar file registry.k8s.io/pause:3.10.1 -> /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/pause_3.10.1 succeeded
	I1202 19:13:43.241615   54807 cache.go:107] acquiring lock: {Name:mk280b51a6d3bfe0cb60ae7355309f1bf1f99e1d Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1202 19:13:43.241641   54807 cache.go:115] /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/etcd_3.6.5-0 exists
	I1202 19:13:43.241629   54807 cache.go:107] acquiring lock: {Name:mkbf2c8ea9fae755e8e7ae1c483527f313757bae Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1202 19:13:43.241645   54807 cache.go:96] cache image "registry.k8s.io/etcd:3.6.5-0" -> "/home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/etcd_3.6.5-0" took 32.345µs
	I1202 19:13:43.241650   54807 cache.go:80] save to tar file registry.k8s.io/etcd:3.6.5-0 -> /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/etcd_3.6.5-0 succeeded
	I1202 19:13:43.241693   54807 cache.go:115] /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/coredns/coredns_v1.13.1 exists
	I1202 19:13:43.241700   54807 cache.go:96] cache image "registry.k8s.io/coredns/coredns:v1.13.1" -> "/home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/coredns/coredns_v1.13.1" took 86.392µs
	I1202 19:13:43.241706   54807 cache.go:80] save to tar file registry.k8s.io/coredns/coredns:v1.13.1 -> /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/coredns/coredns_v1.13.1 succeeded
	I1202 19:13:43.241720   54807 cache.go:87] Successfully saved all images to host disk.
	I1202 19:13:43.258350   54807 fix.go:112] recreateIfNeeded on functional-449836: state=Running err=<nil>
	W1202 19:13:43.258376   54807 fix.go:138] unexpected machine state, will restart: <nil>
	I1202 19:13:43.261600   54807 out.go:252] * Updating the running docker "functional-449836" container ...
	I1202 19:13:43.261627   54807 machine.go:94] provisionDockerMachine start ...
	I1202 19:13:43.261705   54807 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-449836
	I1202 19:13:43.278805   54807 main.go:143] libmachine: Using SSH client type: native
	I1202 19:13:43.279129   54807 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3dad70] 0x3dd270 <nil>  [] 0s} 127.0.0.1 32788 <nil> <nil>}
	I1202 19:13:43.279134   54807 main.go:143] libmachine: About to run SSH command:
	hostname
	I1202 19:13:43.427938   54807 main.go:143] libmachine: SSH cmd err, output: <nil>: functional-449836
	
	I1202 19:13:43.427951   54807 ubuntu.go:182] provisioning hostname "functional-449836"
	I1202 19:13:43.428028   54807 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-449836
	I1202 19:13:43.447456   54807 main.go:143] libmachine: Using SSH client type: native
	I1202 19:13:43.447752   54807 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3dad70] 0x3dd270 <nil>  [] 0s} 127.0.0.1 32788 <nil> <nil>}
	I1202 19:13:43.447759   54807 main.go:143] libmachine: About to run SSH command:
	sudo hostname functional-449836 && echo "functional-449836" | sudo tee /etc/hostname
	I1202 19:13:43.605729   54807 main.go:143] libmachine: SSH cmd err, output: <nil>: functional-449836
	
	I1202 19:13:43.605800   54807 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-449836
	I1202 19:13:43.624976   54807 main.go:143] libmachine: Using SSH client type: native
	I1202 19:13:43.625283   54807 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3dad70] 0x3dd270 <nil>  [] 0s} 127.0.0.1 32788 <nil> <nil>}
	I1202 19:13:43.625296   54807 main.go:143] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sfunctional-449836' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 functional-449836/g' /etc/hosts;
				else 
					echo '127.0.1.1 functional-449836' | sudo tee -a /etc/hosts; 
				fi
			fi
	I1202 19:13:43.772540   54807 main.go:143] libmachine: SSH cmd err, output: <nil>: 
	I1202 19:13:43.772562   54807 ubuntu.go:188] set auth options {CertDir:/home/jenkins/minikube-integration/22021-2487/.minikube CaCertPath:/home/jenkins/minikube-integration/22021-2487/.minikube/certs/ca.pem CaPrivateKeyPath:/home/jenkins/minikube-integration/22021-2487/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/home/jenkins/minikube-integration/22021-2487/.minikube/machines/server.pem ServerKeyPath:/home/jenkins/minikube-integration/22021-2487/.minikube/machines/server-key.pem ClientKeyPath:/home/jenkins/minikube-integration/22021-2487/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/home/jenkins/minikube-integration/22021-2487/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/home/jenkins/minikube-integration/22021-2487/.minikube}
	I1202 19:13:43.772595   54807 ubuntu.go:190] setting up certificates
	I1202 19:13:43.772604   54807 provision.go:84] configureAuth start
	I1202 19:13:43.772671   54807 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" functional-449836
	I1202 19:13:43.790248   54807 provision.go:143] copyHostCerts
	I1202 19:13:43.790316   54807 exec_runner.go:144] found /home/jenkins/minikube-integration/22021-2487/.minikube/ca.pem, removing ...
	I1202 19:13:43.790328   54807 exec_runner.go:203] rm: /home/jenkins/minikube-integration/22021-2487/.minikube/ca.pem
	I1202 19:13:43.790400   54807 exec_runner.go:151] cp: /home/jenkins/minikube-integration/22021-2487/.minikube/certs/ca.pem --> /home/jenkins/minikube-integration/22021-2487/.minikube/ca.pem (1082 bytes)
	I1202 19:13:43.790504   54807 exec_runner.go:144] found /home/jenkins/minikube-integration/22021-2487/.minikube/cert.pem, removing ...
	I1202 19:13:43.790515   54807 exec_runner.go:203] rm: /home/jenkins/minikube-integration/22021-2487/.minikube/cert.pem
	I1202 19:13:43.790538   54807 exec_runner.go:151] cp: /home/jenkins/minikube-integration/22021-2487/.minikube/certs/cert.pem --> /home/jenkins/minikube-integration/22021-2487/.minikube/cert.pem (1123 bytes)
	I1202 19:13:43.790586   54807 exec_runner.go:144] found /home/jenkins/minikube-integration/22021-2487/.minikube/key.pem, removing ...
	I1202 19:13:43.790589   54807 exec_runner.go:203] rm: /home/jenkins/minikube-integration/22021-2487/.minikube/key.pem
	I1202 19:13:43.790610   54807 exec_runner.go:151] cp: /home/jenkins/minikube-integration/22021-2487/.minikube/certs/key.pem --> /home/jenkins/minikube-integration/22021-2487/.minikube/key.pem (1675 bytes)
	I1202 19:13:43.790652   54807 provision.go:117] generating server cert: /home/jenkins/minikube-integration/22021-2487/.minikube/machines/server.pem ca-key=/home/jenkins/minikube-integration/22021-2487/.minikube/certs/ca.pem private-key=/home/jenkins/minikube-integration/22021-2487/.minikube/certs/ca-key.pem org=jenkins.functional-449836 san=[127.0.0.1 192.168.49.2 functional-449836 localhost minikube]
	I1202 19:13:43.836362   54807 provision.go:177] copyRemoteCerts
	I1202 19:13:43.836414   54807 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I1202 19:13:43.836453   54807 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-449836
	I1202 19:13:43.856436   54807 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:32788 SSHKeyPath:/home/jenkins/minikube-integration/22021-2487/.minikube/machines/functional-449836/id_rsa Username:docker}
	I1202 19:13:43.960942   54807 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1082 bytes)
	I1202 19:13:43.990337   54807 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/machines/server.pem --> /etc/docker/server.pem (1220 bytes)
	I1202 19:13:44.010316   54807 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1675 bytes)
	I1202 19:13:44.028611   54807 provision.go:87] duration metric: took 255.971492ms to configureAuth
	I1202 19:13:44.028629   54807 ubuntu.go:206] setting minikube options for container-runtime
	I1202 19:13:44.028821   54807 config.go:182] Loaded profile config "functional-449836": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-beta.0
	I1202 19:13:44.028827   54807 machine.go:97] duration metric: took 767.195405ms to provisionDockerMachine
	I1202 19:13:44.028833   54807 start.go:293] postStartSetup for "functional-449836" (driver="docker")
	I1202 19:13:44.028844   54807 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I1202 19:13:44.028890   54807 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I1202 19:13:44.028937   54807 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-449836
	I1202 19:13:44.046629   54807 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:32788 SSHKeyPath:/home/jenkins/minikube-integration/22021-2487/.minikube/machines/functional-449836/id_rsa Username:docker}
	I1202 19:13:44.156467   54807 ssh_runner.go:195] Run: cat /etc/os-release
	I1202 19:13:44.159958   54807 main.go:143] libmachine: Couldn't set key VERSION_CODENAME, no corresponding struct field found
	I1202 19:13:44.159979   54807 info.go:137] Remote host: Debian GNU/Linux 12 (bookworm)
	I1202 19:13:44.159992   54807 filesync.go:126] Scanning /home/jenkins/minikube-integration/22021-2487/.minikube/addons for local assets ...
	I1202 19:13:44.160053   54807 filesync.go:126] Scanning /home/jenkins/minikube-integration/22021-2487/.minikube/files for local assets ...
	I1202 19:13:44.160131   54807 filesync.go:149] local asset: /home/jenkins/minikube-integration/22021-2487/.minikube/files/etc/ssl/certs/44352.pem -> 44352.pem in /etc/ssl/certs
	I1202 19:13:44.160205   54807 filesync.go:149] local asset: /home/jenkins/minikube-integration/22021-2487/.minikube/files/etc/test/nested/copy/4435/hosts -> hosts in /etc/test/nested/copy/4435
	I1202 19:13:44.160247   54807 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs /etc/test/nested/copy/4435
	I1202 19:13:44.167846   54807 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/files/etc/ssl/certs/44352.pem --> /etc/ssl/certs/44352.pem (1708 bytes)
	I1202 19:13:44.185707   54807 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/files/etc/test/nested/copy/4435/hosts --> /etc/test/nested/copy/4435/hosts (40 bytes)
	I1202 19:13:44.203573   54807 start.go:296] duration metric: took 174.725487ms for postStartSetup
	I1202 19:13:44.203665   54807 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I1202 19:13:44.203703   54807 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-449836
	I1202 19:13:44.221082   54807 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:32788 SSHKeyPath:/home/jenkins/minikube-integration/22021-2487/.minikube/machines/functional-449836/id_rsa Username:docker}
	I1202 19:13:44.321354   54807 ssh_runner.go:195] Run: sh -c "df -BG /var | awk 'NR==2{print $4}'"
	I1202 19:13:44.325951   54807 fix.go:56] duration metric: took 1.085065634s for fixHost
	I1202 19:13:44.325966   54807 start.go:83] releasing machines lock for "functional-449836", held for 1.08509619s
	I1202 19:13:44.326041   54807 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" functional-449836
	I1202 19:13:44.343136   54807 ssh_runner.go:195] Run: cat /version.json
	I1202 19:13:44.343179   54807 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-449836
	I1202 19:13:44.343439   54807 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I1202 19:13:44.343497   54807 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-449836
	I1202 19:13:44.361296   54807 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:32788 SSHKeyPath:/home/jenkins/minikube-integration/22021-2487/.minikube/machines/functional-449836/id_rsa Username:docker}
	I1202 19:13:44.363895   54807 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:32788 SSHKeyPath:/home/jenkins/minikube-integration/22021-2487/.minikube/machines/functional-449836/id_rsa Username:docker}
	I1202 19:13:44.464126   54807 ssh_runner.go:195] Run: systemctl --version
	I1202 19:13:44.557588   54807 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	W1202 19:13:44.561902   54807 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I1202 19:13:44.561962   54807 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%p, " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I1202 19:13:44.569598   54807 cni.go:259] no active bridge cni configs found in "/etc/cni/net.d" - nothing to disable
	I1202 19:13:44.569611   54807 start.go:496] detecting cgroup driver to use...
	I1202 19:13:44.569649   54807 detect.go:187] detected "cgroupfs" cgroup driver on host os
	I1202 19:13:44.569710   54807 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I1202 19:13:44.587349   54807 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I1202 19:13:44.609174   54807 docker.go:218] disabling cri-docker service (if available) ...
	I1202 19:13:44.609228   54807 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.socket
	I1202 19:13:44.629149   54807 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.service
	I1202 19:13:44.643983   54807 ssh_runner.go:195] Run: sudo systemctl disable cri-docker.socket
	I1202 19:13:44.758878   54807 ssh_runner.go:195] Run: sudo systemctl mask cri-docker.service
	I1202 19:13:44.879635   54807 docker.go:234] disabling docker service ...
	I1202 19:13:44.879691   54807 ssh_runner.go:195] Run: sudo systemctl stop -f docker.socket
	I1202 19:13:44.895449   54807 ssh_runner.go:195] Run: sudo systemctl stop -f docker.service
	I1202 19:13:44.908858   54807 ssh_runner.go:195] Run: sudo systemctl disable docker.socket
	I1202 19:13:45.045971   54807 ssh_runner.go:195] Run: sudo systemctl mask docker.service
	I1202 19:13:45.189406   54807 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service docker
	I1202 19:13:45.215003   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I1202 19:13:45.239052   54807 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.10.1"|' /etc/containerd/config.toml"
	I1202 19:13:45.252425   54807 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I1202 19:13:45.264818   54807 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I1202 19:13:45.264881   54807 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I1202 19:13:45.275398   54807 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I1202 19:13:45.286201   54807 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I1202 19:13:45.295830   54807 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I1202 19:13:45.307108   54807 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I1202 19:13:45.315922   54807 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I1202 19:13:45.325735   54807 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I1202 19:13:45.336853   54807 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
	I1202 19:13:45.346391   54807 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I1202 19:13:45.354212   54807 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I1202 19:13:45.361966   54807 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1202 19:13:45.496442   54807 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I1202 19:13:45.617692   54807 start.go:543] Will wait 60s for socket path /run/containerd/containerd.sock
	I1202 19:13:45.617755   54807 ssh_runner.go:195] Run: stat /run/containerd/containerd.sock
	I1202 19:13:45.622143   54807 start.go:564] Will wait 60s for crictl version
	I1202 19:13:45.622212   54807 ssh_runner.go:195] Run: which crictl
	I1202 19:13:45.626172   54807 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl version
	I1202 19:13:45.650746   54807 start.go:580] Version:  0.1.0
	RuntimeName:  containerd
	RuntimeVersion:  v2.1.5
	RuntimeApiVersion:  v1
	I1202 19:13:45.650812   54807 ssh_runner.go:195] Run: containerd --version
	I1202 19:13:45.670031   54807 ssh_runner.go:195] Run: containerd --version
	I1202 19:13:45.697284   54807 out.go:179] * Preparing Kubernetes v1.35.0-beta.0 on containerd 2.1.5 ...
	I1202 19:13:45.700249   54807 cli_runner.go:164] Run: docker network inspect functional-449836 --format "{"Name": "{{.Name}}","Driver": "{{.Driver}}","Subnet": "{{range .IPAM.Config}}{{.Subnet}}{{end}}","Gateway": "{{range .IPAM.Config}}{{.Gateway}}{{end}}","MTU": {{if (index .Options "com.docker.network.driver.mtu")}}{{(index .Options "com.docker.network.driver.mtu")}}{{else}}0{{end}}, "ContainerIPs": [{{range $k,$v := .Containers }}"{{$v.IPv4Address}}",{{end}}]}"
	I1202 19:13:45.717142   54807 ssh_runner.go:195] Run: grep 192.168.49.1	host.minikube.internal$ /etc/hosts
	I1202 19:13:45.724151   54807 out.go:179]   - apiserver.enable-admission-plugins=NamespaceAutoProvision
	I1202 19:13:45.727141   54807 kubeadm.go:884] updating cluster {Name:functional-449836 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-449836 Namespace:default APIServerHAVIP: APIServerName:minikub
eCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker Bina
ryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} ...
	I1202 19:13:45.727279   54807 preload.go:188] Checking if preload exists for k8s version v1.35.0-beta.0 and runtime containerd
	I1202 19:13:45.727346   54807 ssh_runner.go:195] Run: sudo crictl images --output json
	I1202 19:13:45.751767   54807 containerd.go:627] all images are preloaded for containerd runtime.
	I1202 19:13:45.751786   54807 cache_images.go:86] Images are preloaded, skipping loading
	I1202 19:13:45.751792   54807 kubeadm.go:935] updating node { 192.168.49.2 8441 v1.35.0-beta.0 containerd true true} ...
	I1202 19:13:45.751903   54807 kubeadm.go:947] kubelet [Unit]
	Wants=containerd.service
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.35.0-beta.0/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=functional-449836 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.168.49.2
	
	[Install]
	 config:
	{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-449836 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I1202 19:13:45.751976   54807 ssh_runner.go:195] Run: sudo crictl info
	I1202 19:13:45.777030   54807 extraconfig.go:125] Overwriting default enable-admission-plugins=NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota with user provided enable-admission-plugins=NamespaceAutoProvision for component apiserver
	I1202 19:13:45.777052   54807 cni.go:84] Creating CNI manager for ""
	I1202 19:13:45.777060   54807 cni.go:143] "docker" driver + "containerd" runtime found, recommending kindnet
	I1202 19:13:45.777073   54807 kubeadm.go:85] Using pod CIDR: 10.244.0.0/16
	I1202 19:13:45.777095   54807 kubeadm.go:190] kubeadm options: {CertDir:/var/lib/minikube/certs ServiceCIDR:10.96.0.0/12 PodSubnet:10.244.0.0/16 AdvertiseAddress:192.168.49.2 APIServerPort:8441 KubernetesVersion:v1.35.0-beta.0 EtcdDataDir:/var/lib/minikube/etcd EtcdExtraArgs:map[] ClusterName:functional-449836 NodeName:functional-449836 DNSDomain:cluster.local CRISocket:/run/containerd/containerd.sock ImageRepository: ComponentOptions:[{Component:apiServer ExtraArgs:map[enable-admission-plugins:NamespaceAutoProvision] Pairs:map[certSANs:["127.0.0.1", "localhost", "192.168.49.2"]]} {Component:controllerManager ExtraArgs:map[allocate-node-cidrs:true leader-elect:false] Pairs:map[]} {Component:scheduler ExtraArgs:map[leader-elect:false] Pairs:map[]}] FeatureArgs:map[] NodeIP:192.168.49.2 CgroupDriver:cgroupfs ClientCAFile:/var/lib/minikube/certs/ca.crt StaticPodPath:/etc/kubernetes/manifests ControlPlaneAddress:control-plane.minikube.internal KubeProxyOptions:map[] ResolvConfSearchRegression:false Kubel
etConfigOpts:map[containerRuntimeEndpoint:unix:///run/containerd/containerd.sock hairpinMode:hairpin-veth runtimeRequestTimeout:15m] PrependCriSocketUnix:true}
	I1202 19:13:45.777203   54807 kubeadm.go:196] kubeadm config:
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: InitConfiguration
	localAPIEndpoint:
	  advertiseAddress: 192.168.49.2
	  bindPort: 8441
	bootstrapTokens:
	  - groups:
	      - system:bootstrappers:kubeadm:default-node-token
	    ttl: 24h0m0s
	    usages:
	      - signing
	      - authentication
	nodeRegistration:
	  criSocket: unix:///run/containerd/containerd.sock
	  name: "functional-449836"
	  kubeletExtraArgs:
	    - name: "node-ip"
	      value: "192.168.49.2"
	  taints: []
	---
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: ClusterConfiguration
	apiServer:
	  certSANs: ["127.0.0.1", "localhost", "192.168.49.2"]
	  extraArgs:
	    - name: "enable-admission-plugins"
	      value: "NamespaceAutoProvision"
	controllerManager:
	  extraArgs:
	    - name: "allocate-node-cidrs"
	      value: "true"
	    - name: "leader-elect"
	      value: "false"
	scheduler:
	  extraArgs:
	    - name: "leader-elect"
	      value: "false"
	certificatesDir: /var/lib/minikube/certs
	clusterName: mk
	controlPlaneEndpoint: control-plane.minikube.internal:8441
	etcd:
	  local:
	    dataDir: /var/lib/minikube/etcd
	kubernetesVersion: v1.35.0-beta.0
	networking:
	  dnsDomain: cluster.local
	  podSubnet: "10.244.0.0/16"
	  serviceSubnet: 10.96.0.0/12
	---
	apiVersion: kubelet.config.k8s.io/v1beta1
	kind: KubeletConfiguration
	authentication:
	  x509:
	    clientCAFile: /var/lib/minikube/certs/ca.crt
	cgroupDriver: cgroupfs
	containerRuntimeEndpoint: unix:///run/containerd/containerd.sock
	hairpinMode: hairpin-veth
	runtimeRequestTimeout: 15m
	clusterDomain: "cluster.local"
	# disable disk resource management by default
	imageGCHighThresholdPercent: 100
	evictionHard:
	  nodefs.available: "0%"
	  nodefs.inodesFree: "0%"
	  imagefs.available: "0%"
	failSwapOn: false
	staticPodPath: /etc/kubernetes/manifests
	---
	apiVersion: kubeproxy.config.k8s.io/v1alpha1
	kind: KubeProxyConfiguration
	clusterCIDR: "10.244.0.0/16"
	metricsBindAddress: 0.0.0.0:10249
	conntrack:
	  maxPerCore: 0
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_established"
	  tcpEstablishedTimeout: 0s
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_close"
	  tcpCloseWaitTimeout: 0s
	
	I1202 19:13:45.777274   54807 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.35.0-beta.0
	I1202 19:13:45.785000   54807 binaries.go:51] Found k8s binaries, skipping transfer
	I1202 19:13:45.785061   54807 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /var/tmp/minikube
	I1202 19:13:45.792592   54807 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (328 bytes)
	I1202 19:13:45.805336   54807 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (359 bytes)
	I1202 19:13:45.818427   54807 ssh_runner.go:362] scp memory --> /var/tmp/minikube/kubeadm.yaml.new (2087 bytes)
	I1202 19:13:45.830990   54807 ssh_runner.go:195] Run: grep 192.168.49.2	control-plane.minikube.internal$ /etc/hosts
	I1202 19:13:45.834935   54807 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1202 19:13:45.945402   54807 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I1202 19:13:46.172299   54807 certs.go:69] Setting up /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836 for IP: 192.168.49.2
	I1202 19:13:46.172311   54807 certs.go:195] generating shared ca certs ...
	I1202 19:13:46.172340   54807 certs.go:227] acquiring lock for ca certs: {Name:mk2ce7651a779b9fbf8eac798f9ac184328de0c6 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1202 19:13:46.172494   54807 certs.go:236] skipping valid "minikubeCA" ca cert: /home/jenkins/minikube-integration/22021-2487/.minikube/ca.key
	I1202 19:13:46.172550   54807 certs.go:236] skipping valid "proxyClientCA" ca cert: /home/jenkins/minikube-integration/22021-2487/.minikube/proxy-client-ca.key
	I1202 19:13:46.172557   54807 certs.go:257] generating profile certs ...
	I1202 19:13:46.172651   54807 certs.go:360] skipping valid signed profile cert regeneration for "minikube-user": /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/client.key
	I1202 19:13:46.172725   54807 certs.go:360] skipping valid signed profile cert regeneration for "minikube": /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/apiserver.key.a65b71da
	I1202 19:13:46.172770   54807 certs.go:360] skipping valid signed profile cert regeneration for "aggregator": /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/proxy-client.key
	I1202 19:13:46.172876   54807 certs.go:484] found cert: /home/jenkins/minikube-integration/22021-2487/.minikube/certs/4435.pem (1338 bytes)
	W1202 19:13:46.172906   54807 certs.go:480] ignoring /home/jenkins/minikube-integration/22021-2487/.minikube/certs/4435_empty.pem, impossibly tiny 0 bytes
	I1202 19:13:46.172913   54807 certs.go:484] found cert: /home/jenkins/minikube-integration/22021-2487/.minikube/certs/ca-key.pem (1679 bytes)
	I1202 19:13:46.172944   54807 certs.go:484] found cert: /home/jenkins/minikube-integration/22021-2487/.minikube/certs/ca.pem (1082 bytes)
	I1202 19:13:46.172967   54807 certs.go:484] found cert: /home/jenkins/minikube-integration/22021-2487/.minikube/certs/cert.pem (1123 bytes)
	I1202 19:13:46.172992   54807 certs.go:484] found cert: /home/jenkins/minikube-integration/22021-2487/.minikube/certs/key.pem (1675 bytes)
	I1202 19:13:46.173034   54807 certs.go:484] found cert: /home/jenkins/minikube-integration/22021-2487/.minikube/files/etc/ssl/certs/44352.pem (1708 bytes)
	I1202 19:13:46.174236   54807 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I1202 19:13:46.206005   54807 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1675 bytes)
	I1202 19:13:46.223256   54807 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I1202 19:13:46.250390   54807 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1675 bytes)
	I1202 19:13:46.270550   54807 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1424 bytes)
	I1202 19:13:46.289153   54807 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1675 bytes)
	I1202 19:13:46.307175   54807 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I1202 19:13:46.325652   54807 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1675 bytes)
	I1202 19:13:46.343823   54807 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/files/etc/ssl/certs/44352.pem --> /usr/share/ca-certificates/44352.pem (1708 bytes)
	I1202 19:13:46.361647   54807 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I1202 19:13:46.379597   54807 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/certs/4435.pem --> /usr/share/ca-certificates/4435.pem (1338 bytes)
	I1202 19:13:46.397750   54807 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (738 bytes)
	I1202 19:13:46.411087   54807 ssh_runner.go:195] Run: openssl version
	I1202 19:13:46.418777   54807 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/4435.pem && ln -fs /usr/share/ca-certificates/4435.pem /etc/ssl/certs/4435.pem"
	I1202 19:13:46.427262   54807 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/4435.pem
	I1202 19:13:46.431022   54807 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Dec  2 18:58 /usr/share/ca-certificates/4435.pem
	I1202 19:13:46.431093   54807 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/4435.pem
	I1202 19:13:46.473995   54807 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/51391683.0 || ln -fs /etc/ssl/certs/4435.pem /etc/ssl/certs/51391683.0"
	I1202 19:13:46.482092   54807 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/44352.pem && ln -fs /usr/share/ca-certificates/44352.pem /etc/ssl/certs/44352.pem"
	I1202 19:13:46.490432   54807 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/44352.pem
	I1202 19:13:46.494266   54807 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Dec  2 18:58 /usr/share/ca-certificates/44352.pem
	I1202 19:13:46.494320   54807 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/44352.pem
	I1202 19:13:46.535125   54807 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/3ec20f2e.0 || ln -fs /etc/ssl/certs/44352.pem /etc/ssl/certs/3ec20f2e.0"
	I1202 19:13:46.543277   54807 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I1202 19:13:46.551769   54807 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I1202 19:13:46.555743   54807 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Dec  2 18:48 /usr/share/ca-certificates/minikubeCA.pem
	I1202 19:13:46.555797   54807 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I1202 19:13:46.597778   54807 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
	I1202 19:13:46.605874   54807 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I1202 19:13:46.609733   54807 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-etcd-client.crt -checkend 86400
	I1202 19:13:46.652482   54807 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-kubelet-client.crt -checkend 86400
	I1202 19:13:46.693214   54807 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/server.crt -checkend 86400
	I1202 19:13:46.734654   54807 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/healthcheck-client.crt -checkend 86400
	I1202 19:13:46.775729   54807 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/peer.crt -checkend 86400
	I1202 19:13:46.821319   54807 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/front-proxy-client.crt -checkend 86400
	I1202 19:13:46.862299   54807 kubeadm.go:401] StartCluster: {Name:functional-449836 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-449836 Namespace:default APIServerHAVIP: APIServerName:minikubeCA
APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryM
irror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1202 19:13:46.862398   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:paused Name: Namespaces:[kube-system]}
	I1202 19:13:46.862468   54807 ssh_runner.go:195] Run: sudo -s eval "crictl ps -a --quiet --label io.kubernetes.pod.namespace=kube-system"
	I1202 19:13:46.891099   54807 cri.go:89] found id: ""
	I1202 19:13:46.891159   54807 ssh_runner.go:195] Run: sudo ls /var/lib/kubelet/kubeadm-flags.env /var/lib/kubelet/config.yaml /var/lib/minikube/etcd
	I1202 19:13:46.898813   54807 kubeadm.go:417] found existing configuration files, will attempt cluster restart
	I1202 19:13:46.898821   54807 kubeadm.go:598] restartPrimaryControlPlane start ...
	I1202 19:13:46.898874   54807 ssh_runner.go:195] Run: sudo test -d /data/minikube
	I1202 19:13:46.906272   54807 kubeadm.go:131] /data/minikube skipping compat symlinks: sudo test -d /data/minikube: Process exited with status 1
	stdout:
	
	stderr:
	I1202 19:13:46.906775   54807 kubeconfig.go:125] found "functional-449836" server: "https://192.168.49.2:8441"
	I1202 19:13:46.908038   54807 ssh_runner.go:195] Run: sudo diff -u /var/tmp/minikube/kubeadm.yaml /var/tmp/minikube/kubeadm.yaml.new
	I1202 19:13:46.915724   54807 kubeadm.go:645] detected kubeadm config drift (will reconfigure cluster from new /var/tmp/minikube/kubeadm.yaml):
	-- stdout --
	--- /var/tmp/minikube/kubeadm.yaml	2025-12-02 18:59:11.521818114 +0000
	+++ /var/tmp/minikube/kubeadm.yaml.new	2025-12-02 19:13:45.826341203 +0000
	@@ -24,7 +24,7 @@
	   certSANs: ["127.0.0.1", "localhost", "192.168.49.2"]
	   extraArgs:
	     - name: "enable-admission-plugins"
	-      value: "NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota"
	+      value: "NamespaceAutoProvision"
	 controllerManager:
	   extraArgs:
	     - name: "allocate-node-cidrs"
	
	-- /stdout --
	I1202 19:13:46.915744   54807 kubeadm.go:1161] stopping kube-system containers ...
	I1202 19:13:46.915757   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name: Namespaces:[kube-system]}
	I1202 19:13:46.915816   54807 ssh_runner.go:195] Run: sudo -s eval "crictl ps -a --quiet --label io.kubernetes.pod.namespace=kube-system"
	I1202 19:13:46.943936   54807 cri.go:89] found id: ""
	I1202 19:13:46.944009   54807 ssh_runner.go:195] Run: sudo systemctl stop kubelet
	I1202 19:13:46.961843   54807 ssh_runner.go:195] Run: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf
	I1202 19:13:46.971074   54807 kubeadm.go:158] found existing configuration files:
	-rw------- 1 root root 5635 Dec  2 19:03 /etc/kubernetes/admin.conf
	-rw------- 1 root root 5640 Dec  2 19:03 /etc/kubernetes/controller-manager.conf
	-rw------- 1 root root 5672 Dec  2 19:03 /etc/kubernetes/kubelet.conf
	-rw------- 1 root root 5584 Dec  2 19:03 /etc/kubernetes/scheduler.conf
	
	I1202 19:13:46.971137   54807 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/admin.conf
	I1202 19:13:46.979452   54807 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf
	I1202 19:13:46.987399   54807 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/kubelet.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf: Process exited with status 1
	stdout:
	
	stderr:
	I1202 19:13:46.987454   54807 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/kubelet.conf
	I1202 19:13:46.994869   54807 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf
	I1202 19:13:47.002498   54807 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/controller-manager.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf: Process exited with status 1
	stdout:
	
	stderr:
	I1202 19:13:47.002560   54807 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/controller-manager.conf
	I1202 19:13:47.010116   54807 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf
	I1202 19:13:47.017891   54807 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/scheduler.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf: Process exited with status 1
	stdout:
	
	stderr:
	I1202 19:13:47.017946   54807 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/scheduler.conf
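The cleanup above follows a grep-then-remove pattern: each kubeconfig is kept only if it already references the expected control-plane endpoint, and a non-zero `grep` exit marks it stale so `kubeadm init phase kubeconfig` can regenerate it. A hedged sketch of that loop (the directory is a parameter here for illustration; the log operates on `/etc/kubernetes` via sudo):

```shell
#!/bin/sh
# Keep a kubeconfig only if it references the expected endpoint;
# otherwise remove it so `kubeadm init phase kubeconfig all`
# regenerates it from the new kubeadm.yaml.
prune_stale_kubeconfigs() {
  dir="$1"
  endpoint="https://control-plane.minikube.internal:8441"
  for conf in admin.conf kubelet.conf controller-manager.conf scheduler.conf; do
    path="$dir/$conf"
    [ -f "$path" ] || continue
    if ! grep -q "$endpoint" "$path"; then
      rm -f "$path"
      echo "removed stale $path"
    fi
  done
}
```

In the log, only `admin.conf` passes the grep; the other three files are removed and rebuilt.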
	I1202 19:13:47.025383   54807 ssh_runner.go:195] Run: sudo cp /var/tmp/minikube/kubeadm.yaml.new /var/tmp/minikube/kubeadm.yaml
	I1202 19:13:47.033423   54807 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init phase certs all --config /var/tmp/minikube/kubeadm.yaml"
	I1202 19:13:47.076377   54807 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init phase kubeconfig all --config /var/tmp/minikube/kubeadm.yaml"
	I1202 19:13:48.395417   54807 ssh_runner.go:235] Completed: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init phase kubeconfig all --config /var/tmp/minikube/kubeadm.yaml": (1.319015091s)
	I1202 19:13:48.395495   54807 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init phase kubelet-start --config /var/tmp/minikube/kubeadm.yaml"
	I1202 19:13:48.604942   54807 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init phase control-plane all --config /var/tmp/minikube/kubeadm.yaml"
	I1202 19:13:48.668399   54807 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init phase etcd local --config /var/tmp/minikube/kubeadm.yaml"
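Rather than a full `kubeadm init`, the restart path replays individual init phases against the same config file: certs, kubeconfigs, kubelet start, control-plane static pods, and local etcd. A dry-run sketch that prints the command sequence visible in the log (paths mirror the log; nothing is executed):

```shell
#!/bin/sh
# Print the kubeadm init phases the restart path replays, in order.
# The version-matched kubeadm binary dir is prefixed onto PATH.
print_restart_phases() {
  cfg=/var/tmp/minikube/kubeadm.yaml
  bindir=/var/lib/minikube/binaries/v1.35.0-beta.0
  for phase in "certs all" "kubeconfig all" "kubelet-start" \
               "control-plane all" "etcd local"; do
    echo "env PATH=\"$bindir:\$PATH\" kubeadm init phase $phase --config $cfg"
  done
}
print_restart_phases
```

Replaying phases preserves whatever state is still valid (existing certs are reused rather than regenerated) while rewriting the pieces invalidated by the config drift.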
	I1202 19:13:48.712382   54807 api_server.go:52] waiting for apiserver process to appear ...
	I1202 19:13:48.712452   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:13:49.212900   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:13:49.713354   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:13:50.213340   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:13:50.713260   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:13:51.212621   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:13:51.713471   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:13:52.213212   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:13:52.712687   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:13:53.212572   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:13:53.713310   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:13:54.212640   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:13:54.712595   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:13:55.213133   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:13:55.712621   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:13:56.212595   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:13:56.713443   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:13:57.213230   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:13:57.713055   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:13:58.213071   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:13:58.712680   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:13:59.213352   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:13:59.712654   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:00.213647   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:00.712569   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:01.212673   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:01.713030   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:02.212581   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:02.712631   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:03.213287   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:03.712572   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:04.213500   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:04.713557   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:05.213523   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:05.713480   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:06.212772   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:06.713553   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:07.213309   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:07.712616   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:08.212729   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:08.713403   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:09.212625   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:09.713385   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:10.212662   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:10.712619   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:11.213505   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:11.712640   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:12.213396   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:12.712571   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:13.212963   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:13.713403   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:14.213457   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:14.712620   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:15.213335   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:15.713379   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:16.212612   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:16.712624   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:17.212573   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:17.713394   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:18.213294   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:18.713516   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:19.213531   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:19.713309   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:20.212591   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:20.713575   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:21.213560   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:21.713513   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:22.213219   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:22.712620   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:23.213273   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:23.713477   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:24.213364   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:24.712581   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:25.212597   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:25.713554   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:26.213205   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:26.712517   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:27.213345   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:27.712602   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:28.212602   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:28.713533   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:29.213188   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:29.713102   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:30.212626   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:30.712732   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:31.212615   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:31.713473   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:32.212590   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:32.712645   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:33.213398   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:33.713081   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:34.213498   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:34.712625   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:35.213560   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:35.712634   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:36.213370   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:36.712576   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:37.213006   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:37.712656   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:38.212594   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:38.713448   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:39.213442   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:39.712577   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:40.212621   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:40.713516   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:41.212756   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:41.712509   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:42.215715   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:42.712573   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:43.212604   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:43.712621   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:44.213283   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:44.712621   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:45.213407   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:45.712947   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:46.213239   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:46.712626   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:47.213260   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:47.713210   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:48.212639   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
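The long run of `pgrep` lines above is a fixed-interval poll: the check command is retried on a roughly 500ms cadence (visible in the timestamps) until it succeeds or the wait deadline passes. A generic sketch of that loop, with the check command parameterized (minikube's actual check is `sudo pgrep -xnf kube-apiserver.*minikube.*`):

```shell
#!/bin/sh
# Poll a check command at a fixed interval until it succeeds or the
# attempt budget is exhausted; mirrors the ~500ms cadence in the log.
wait_for() {
  check="$1"   # command to run each attempt
  tries="$2"   # maximum number of attempts
  i=0
  while [ "$i" -lt "$tries" ]; do
    if eval "$check" > /dev/null 2>&1; then
      echo "found"
      return 0
    fi
    sleep 0.5
    i=$((i + 1))
  done
  echo "timed out"
  return 1
}
```

Here the poll never succeeds: the apiserver process never appears within the minute-long window, so the code falls through to container listing and log gathering below.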
	I1202 19:14:48.713264   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:14:48.713347   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:14:48.742977   54807 cri.go:89] found id: ""
	I1202 19:14:48.742990   54807 logs.go:282] 0 containers: []
	W1202 19:14:48.742997   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:14:48.743002   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:14:48.743061   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:14:48.767865   54807 cri.go:89] found id: ""
	I1202 19:14:48.767879   54807 logs.go:282] 0 containers: []
	W1202 19:14:48.767886   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:14:48.767892   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:14:48.767949   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:14:48.792531   54807 cri.go:89] found id: ""
	I1202 19:14:48.792544   54807 logs.go:282] 0 containers: []
	W1202 19:14:48.792560   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:14:48.792566   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:14:48.792624   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:14:48.821644   54807 cri.go:89] found id: ""
	I1202 19:14:48.821657   54807 logs.go:282] 0 containers: []
	W1202 19:14:48.821665   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:14:48.821670   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:14:48.821729   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:14:48.847227   54807 cri.go:89] found id: ""
	I1202 19:14:48.847246   54807 logs.go:282] 0 containers: []
	W1202 19:14:48.847253   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:14:48.847258   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:14:48.847318   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:14:48.872064   54807 cri.go:89] found id: ""
	I1202 19:14:48.872084   54807 logs.go:282] 0 containers: []
	W1202 19:14:48.872091   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:14:48.872097   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:14:48.872155   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:14:48.895905   54807 cri.go:89] found id: ""
	I1202 19:14:48.895919   54807 logs.go:282] 0 containers: []
	W1202 19:14:48.895925   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:14:48.895933   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:14:48.895945   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:14:48.962492   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:14:48.954400   11315 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:14:48.954981   11315 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:14:48.956769   11315 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:14:48.957381   11315 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:14:48.959046   11315 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:14:48.954400   11315 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:14:48.954981   11315 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:14:48.956769   11315 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:14:48.957381   11315 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:14:48.959046   11315 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:14:48.962515   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:14:48.962526   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:14:49.026861   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:14:49.026881   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:14:49.059991   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:14:49.060006   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:14:49.119340   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:14:49.119357   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
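The "Gathering logs" block above is a diagnostics sweep run once the apiserver fails to appear: each source is collected independently, so one failing collector (here, `kubectl describe nodes` against a dead apiserver) does not stop the rest. A sketch of that sweep with the commands echoed rather than executed (the exact flags are taken from the log):

```shell
#!/bin/sh
# Echo the independent diagnostic collectors the log runs; in minikube
# each is executed via `sudo sh -c` and failures are logged, not fatal.
gather_diagnostics() {
  for src in "journalctl -u kubelet -n 400" \
             "journalctl -u containerd -n 400" \
             "crictl ps -a" \
             "dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"; do
    echo "== sudo sh -c '$src' =="
  done
}
gather_diagnostics
```

The same sweep repeats on each retry below, which is why the identical "connection refused" stderr block appears again a few seconds later.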
	I1202 19:14:51.632315   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:51.642501   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:14:51.642560   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:14:51.669041   54807 cri.go:89] found id: ""
	I1202 19:14:51.669054   54807 logs.go:282] 0 containers: []
	W1202 19:14:51.669061   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:14:51.669086   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:14:51.669150   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:14:51.698828   54807 cri.go:89] found id: ""
	I1202 19:14:51.698857   54807 logs.go:282] 0 containers: []
	W1202 19:14:51.698864   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:14:51.698870   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:14:51.698939   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:14:51.739419   54807 cri.go:89] found id: ""
	I1202 19:14:51.739446   54807 logs.go:282] 0 containers: []
	W1202 19:14:51.739454   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:14:51.739459   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:14:51.739532   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:14:51.764613   54807 cri.go:89] found id: ""
	I1202 19:14:51.764627   54807 logs.go:282] 0 containers: []
	W1202 19:14:51.764633   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:14:51.764639   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:14:51.764698   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:14:51.790197   54807 cri.go:89] found id: ""
	I1202 19:14:51.790211   54807 logs.go:282] 0 containers: []
	W1202 19:14:51.790217   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:14:51.790222   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:14:51.790281   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:14:51.824131   54807 cri.go:89] found id: ""
	I1202 19:14:51.824144   54807 logs.go:282] 0 containers: []
	W1202 19:14:51.824151   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:14:51.824170   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:14:51.824228   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:14:51.848893   54807 cri.go:89] found id: ""
	I1202 19:14:51.848907   54807 logs.go:282] 0 containers: []
	W1202 19:14:51.848914   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:14:51.848922   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:14:51.848932   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:14:51.877099   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:14:51.877114   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:14:51.933539   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:14:51.933560   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:14:51.944309   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:14:51.944346   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:14:52.014156   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:14:52.005976   11437 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:14:52.006753   11437 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:14:52.008549   11437 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:14:52.009096   11437 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:14:52.010706   11437 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:14:52.005976   11437 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:14:52.006753   11437 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:14:52.008549   11437 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:14:52.009096   11437 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:14:52.010706   11437 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:14:52.014167   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:14:52.014178   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:14:54.578451   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:54.588802   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:14:54.588862   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:14:54.613620   54807 cri.go:89] found id: ""
	I1202 19:14:54.613633   54807 logs.go:282] 0 containers: []
	W1202 19:14:54.613640   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:14:54.613646   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:14:54.613704   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:14:54.637471   54807 cri.go:89] found id: ""
	I1202 19:14:54.637486   54807 logs.go:282] 0 containers: []
	W1202 19:14:54.637498   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:14:54.637503   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:14:54.637561   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:14:54.662053   54807 cri.go:89] found id: ""
	I1202 19:14:54.662066   54807 logs.go:282] 0 containers: []
	W1202 19:14:54.662073   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:14:54.662079   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:14:54.662135   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:14:54.694901   54807 cri.go:89] found id: ""
	I1202 19:14:54.694916   54807 logs.go:282] 0 containers: []
	W1202 19:14:54.694923   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:14:54.694928   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:14:54.694998   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:14:54.728487   54807 cri.go:89] found id: ""
	I1202 19:14:54.728500   54807 logs.go:282] 0 containers: []
	W1202 19:14:54.728507   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:14:54.728512   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:14:54.728569   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:14:54.756786   54807 cri.go:89] found id: ""
	I1202 19:14:54.756800   54807 logs.go:282] 0 containers: []
	W1202 19:14:54.756806   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:14:54.756812   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:14:54.756868   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:14:54.782187   54807 cri.go:89] found id: ""
	I1202 19:14:54.782200   54807 logs.go:282] 0 containers: []
	W1202 19:14:54.782212   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:14:54.782220   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:14:54.782231   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:14:54.846497   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:14:54.838849   11526 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:14:54.839509   11526 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:14:54.840990   11526 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:14:54.841320   11526 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:14:54.842849   11526 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:14:54.838849   11526 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:14:54.839509   11526 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:14:54.840990   11526 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:14:54.841320   11526 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:14:54.842849   11526 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:14:54.846510   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:14:54.846521   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:14:54.909600   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:14:54.909620   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:14:54.943132   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:14:54.943150   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:14:55.006561   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:14:55.006581   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:14:57.519164   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:57.529445   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:14:57.529506   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:14:57.554155   54807 cri.go:89] found id: ""
	I1202 19:14:57.554168   54807 logs.go:282] 0 containers: []
	W1202 19:14:57.554176   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:14:57.554181   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:14:57.554240   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:14:57.579453   54807 cri.go:89] found id: ""
	I1202 19:14:57.579468   54807 logs.go:282] 0 containers: []
	W1202 19:14:57.579474   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:14:57.579480   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:14:57.579537   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:14:57.608139   54807 cri.go:89] found id: ""
	I1202 19:14:57.608152   54807 logs.go:282] 0 containers: []
	W1202 19:14:57.608160   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:14:57.608165   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:14:57.608224   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:14:57.632309   54807 cri.go:89] found id: ""
	I1202 19:14:57.632360   54807 logs.go:282] 0 containers: []
	W1202 19:14:57.632368   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:14:57.632374   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:14:57.632434   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:14:57.657933   54807 cri.go:89] found id: ""
	I1202 19:14:57.657947   54807 logs.go:282] 0 containers: []
	W1202 19:14:57.657954   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:14:57.657959   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:14:57.658019   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:14:57.698982   54807 cri.go:89] found id: ""
	I1202 19:14:57.698996   54807 logs.go:282] 0 containers: []
	W1202 19:14:57.699002   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:14:57.699008   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:14:57.699105   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:14:57.738205   54807 cri.go:89] found id: ""
	I1202 19:14:57.738219   54807 logs.go:282] 0 containers: []
	W1202 19:14:57.738226   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:14:57.738234   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:14:57.738245   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:14:57.802193   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:14:57.794304   11633 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:14:57.794844   11633 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:14:57.796409   11633 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:14:57.796881   11633 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:14:57.798306   11633 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:14:57.794304   11633 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:14:57.794844   11633 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:14:57.796409   11633 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:14:57.796881   11633 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:14:57.798306   11633 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:14:57.802204   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:14:57.802215   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:14:57.865638   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:14:57.865657   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:14:57.900835   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:14:57.900850   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:14:57.958121   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:14:57.958139   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:15:00.502580   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:15:00.515602   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:15:00.515692   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:15:00.553262   54807 cri.go:89] found id: ""
	I1202 19:15:00.553290   54807 logs.go:282] 0 containers: []
	W1202 19:15:00.553298   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:15:00.553304   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:15:00.553372   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:15:00.592663   54807 cri.go:89] found id: ""
	I1202 19:15:00.592678   54807 logs.go:282] 0 containers: []
	W1202 19:15:00.592686   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:15:00.592691   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:15:00.592782   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:15:00.624403   54807 cri.go:89] found id: ""
	I1202 19:15:00.624423   54807 logs.go:282] 0 containers: []
	W1202 19:15:00.624431   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:15:00.624438   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:15:00.624521   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:15:00.659265   54807 cri.go:89] found id: ""
	I1202 19:15:00.659280   54807 logs.go:282] 0 containers: []
	W1202 19:15:00.659288   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:15:00.659294   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:15:00.659383   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:15:00.695489   54807 cri.go:89] found id: ""
	I1202 19:15:00.695508   54807 logs.go:282] 0 containers: []
	W1202 19:15:00.695517   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:15:00.695523   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:15:00.695592   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:15:00.732577   54807 cri.go:89] found id: ""
	I1202 19:15:00.732592   54807 logs.go:282] 0 containers: []
	W1202 19:15:00.732600   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:15:00.732607   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:15:00.732696   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:15:00.767521   54807 cri.go:89] found id: ""
	I1202 19:15:00.767538   54807 logs.go:282] 0 containers: []
	W1202 19:15:00.767546   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:15:00.767555   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:15:00.767566   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:15:00.829818   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:15:00.829837   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:15:00.842792   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:15:00.842810   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:15:00.919161   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:15:00.909398   11742 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:00.910484   11742 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:00.912564   11742 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:00.913381   11742 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:00.915298   11742 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:15:00.909398   11742 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:00.910484   11742 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:00.912564   11742 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:00.913381   11742 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:00.915298   11742 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:15:00.919174   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:15:00.919193   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:15:00.985798   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:15:00.985819   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:15:03.521258   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:15:03.531745   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:15:03.531810   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:15:03.556245   54807 cri.go:89] found id: ""
	I1202 19:15:03.556258   54807 logs.go:282] 0 containers: []
	W1202 19:15:03.556265   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:15:03.556271   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:15:03.556355   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:15:03.580774   54807 cri.go:89] found id: ""
	I1202 19:15:03.580787   54807 logs.go:282] 0 containers: []
	W1202 19:15:03.580794   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:15:03.580799   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:15:03.580857   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:15:03.606247   54807 cri.go:89] found id: ""
	I1202 19:15:03.606261   54807 logs.go:282] 0 containers: []
	W1202 19:15:03.606269   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:15:03.606274   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:15:03.606335   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:15:03.631169   54807 cri.go:89] found id: ""
	I1202 19:15:03.631182   54807 logs.go:282] 0 containers: []
	W1202 19:15:03.631189   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:15:03.631195   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:15:03.631252   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:15:03.657089   54807 cri.go:89] found id: ""
	I1202 19:15:03.657111   54807 logs.go:282] 0 containers: []
	W1202 19:15:03.657118   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:15:03.657124   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:15:03.657183   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:15:03.699997   54807 cri.go:89] found id: ""
	I1202 19:15:03.700010   54807 logs.go:282] 0 containers: []
	W1202 19:15:03.700017   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:15:03.700023   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:15:03.700081   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:15:03.725717   54807 cri.go:89] found id: ""
	I1202 19:15:03.725731   54807 logs.go:282] 0 containers: []
	W1202 19:15:03.725738   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:15:03.725746   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:15:03.725755   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:15:03.793907   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:15:03.793928   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:15:03.822178   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:15:03.822199   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:15:03.881429   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:15:03.881453   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:15:03.892554   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:15:03.892569   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:15:03.960792   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:15:03.952836   11862 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:03.953779   11862 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:03.955515   11862 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:03.955829   11862 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:03.957393   11862 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:15:03.952836   11862 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:03.953779   11862 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:03.955515   11862 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:03.955829   11862 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:03.957393   11862 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:15:06.461036   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:15:06.471459   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:15:06.471519   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:15:06.500164   54807 cri.go:89] found id: ""
	I1202 19:15:06.500178   54807 logs.go:282] 0 containers: []
	W1202 19:15:06.500184   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:15:06.500190   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:15:06.500253   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:15:06.526532   54807 cri.go:89] found id: ""
	I1202 19:15:06.526545   54807 logs.go:282] 0 containers: []
	W1202 19:15:06.526552   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:15:06.526558   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:15:06.526616   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:15:06.551534   54807 cri.go:89] found id: ""
	I1202 19:15:06.551553   54807 logs.go:282] 0 containers: []
	W1202 19:15:06.551560   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:15:06.551566   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:15:06.551628   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:15:06.577486   54807 cri.go:89] found id: ""
	I1202 19:15:06.577500   54807 logs.go:282] 0 containers: []
	W1202 19:15:06.577506   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:15:06.577512   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:15:06.577570   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:15:06.607506   54807 cri.go:89] found id: ""
	I1202 19:15:06.607520   54807 logs.go:282] 0 containers: []
	W1202 19:15:06.607529   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:15:06.607535   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:15:06.607663   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:15:06.632779   54807 cri.go:89] found id: ""
	I1202 19:15:06.632792   54807 logs.go:282] 0 containers: []
	W1202 19:15:06.632799   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:15:06.632805   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:15:06.632866   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:15:06.656916   54807 cri.go:89] found id: ""
	I1202 19:15:06.656928   54807 logs.go:282] 0 containers: []
	W1202 19:15:06.656936   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:15:06.656943   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:15:06.656953   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:15:06.721178   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:15:06.721197   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:15:06.733421   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:15:06.733437   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:15:06.806706   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:15:06.798437   11957 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:06.799136   11957 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:06.800799   11957 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:06.801507   11957 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:06.803118   11957 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:15:06.798437   11957 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:06.799136   11957 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:06.800799   11957 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:06.801507   11957 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:06.803118   11957 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:15:06.806717   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:15:06.806728   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:15:06.870452   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:15:06.870471   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:15:09.403297   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:15:09.414259   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:15:09.414319   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:15:09.442090   54807 cri.go:89] found id: ""
	I1202 19:15:09.442103   54807 logs.go:282] 0 containers: []
	W1202 19:15:09.442110   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:15:09.442115   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:15:09.442175   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:15:09.471784   54807 cri.go:89] found id: ""
	I1202 19:15:09.471797   54807 logs.go:282] 0 containers: []
	W1202 19:15:09.471804   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:15:09.471809   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:15:09.471887   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:15:09.496688   54807 cri.go:89] found id: ""
	I1202 19:15:09.496701   54807 logs.go:282] 0 containers: []
	W1202 19:15:09.496708   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:15:09.496714   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:15:09.496773   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:15:09.522932   54807 cri.go:89] found id: ""
	I1202 19:15:09.522946   54807 logs.go:282] 0 containers: []
	W1202 19:15:09.522952   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:15:09.522957   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:15:09.523018   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:15:09.550254   54807 cri.go:89] found id: ""
	I1202 19:15:09.550268   54807 logs.go:282] 0 containers: []
	W1202 19:15:09.550275   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:15:09.550280   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:15:09.550341   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:15:09.578955   54807 cri.go:89] found id: ""
	I1202 19:15:09.578968   54807 logs.go:282] 0 containers: []
	W1202 19:15:09.578975   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:15:09.578980   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:15:09.579041   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:15:09.603797   54807 cri.go:89] found id: ""
	I1202 19:15:09.603812   54807 logs.go:282] 0 containers: []
	W1202 19:15:09.603819   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:15:09.603827   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:15:09.603837   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:15:09.660195   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:15:09.660215   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:15:09.671581   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:15:09.671596   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:15:09.755982   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:15:09.748446   12065 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:09.749204   12065 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:09.750900   12065 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:09.751203   12065 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:09.752674   12065 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:15:09.748446   12065 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:09.749204   12065 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:09.750900   12065 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:09.751203   12065 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:09.752674   12065 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:15:09.755993   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:15:09.756013   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:15:09.820958   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:15:09.820977   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:15:12.349982   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:15:12.359890   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:15:12.359953   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:15:12.387716   54807 cri.go:89] found id: ""
	I1202 19:15:12.387729   54807 logs.go:282] 0 containers: []
	W1202 19:15:12.387736   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:15:12.387741   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:15:12.387802   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:15:12.413168   54807 cri.go:89] found id: ""
	I1202 19:15:12.413182   54807 logs.go:282] 0 containers: []
	W1202 19:15:12.413188   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:15:12.413194   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:15:12.413262   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:15:12.441234   54807 cri.go:89] found id: ""
	I1202 19:15:12.441247   54807 logs.go:282] 0 containers: []
	W1202 19:15:12.441253   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:15:12.441262   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:15:12.441321   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:15:12.465660   54807 cri.go:89] found id: ""
	I1202 19:15:12.465673   54807 logs.go:282] 0 containers: []
	W1202 19:15:12.465680   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:15:12.465689   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:15:12.465747   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:15:12.489519   54807 cri.go:89] found id: ""
	I1202 19:15:12.489532   54807 logs.go:282] 0 containers: []
	W1202 19:15:12.489540   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:15:12.489545   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:15:12.489605   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:15:12.514756   54807 cri.go:89] found id: ""
	I1202 19:15:12.514770   54807 logs.go:282] 0 containers: []
	W1202 19:15:12.514777   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:15:12.514782   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:15:12.514843   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:15:12.538845   54807 cri.go:89] found id: ""
	I1202 19:15:12.538858   54807 logs.go:282] 0 containers: []
	W1202 19:15:12.538865   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:15:12.538872   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:15:12.538884   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:15:12.549453   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:15:12.549477   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:15:12.616294   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:15:12.608411   12164 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:12.609095   12164 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:12.610814   12164 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:12.611359   12164 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:12.612794   12164 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:15:12.608411   12164 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:12.609095   12164 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:12.610814   12164 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:12.611359   12164 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:12.612794   12164 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:15:12.616304   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:15:12.616315   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:15:12.679579   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:15:12.679598   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:15:12.712483   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:15:12.712499   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:15:15.277003   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:15:15.287413   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:15:15.287496   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:15:15.313100   54807 cri.go:89] found id: ""
	I1202 19:15:15.313113   54807 logs.go:282] 0 containers: []
	W1202 19:15:15.313120   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:15:15.313135   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:15:15.313194   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:15:15.339367   54807 cri.go:89] found id: ""
	I1202 19:15:15.339381   54807 logs.go:282] 0 containers: []
	W1202 19:15:15.339387   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:15:15.339393   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:15:15.339463   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:15:15.364247   54807 cri.go:89] found id: ""
	I1202 19:15:15.364270   54807 logs.go:282] 0 containers: []
	W1202 19:15:15.364277   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:15:15.364283   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:15:15.364393   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:15:15.389379   54807 cri.go:89] found id: ""
	I1202 19:15:15.389393   54807 logs.go:282] 0 containers: []
	W1202 19:15:15.389401   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:15:15.389412   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:15:15.389472   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:15:15.414364   54807 cri.go:89] found id: ""
	I1202 19:15:15.414378   54807 logs.go:282] 0 containers: []
	W1202 19:15:15.414386   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:15:15.414391   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:15:15.414455   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:15:15.438995   54807 cri.go:89] found id: ""
	I1202 19:15:15.439009   54807 logs.go:282] 0 containers: []
	W1202 19:15:15.439024   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:15:15.439030   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:15:15.439097   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:15:15.467973   54807 cri.go:89] found id: ""
	I1202 19:15:15.467986   54807 logs.go:282] 0 containers: []
	W1202 19:15:15.467993   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:15:15.468001   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:15:15.468010   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:15:15.534212   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:15:15.526543   12263 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:15.527142   12263 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:15.528724   12263 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:15.529277   12263 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:15.530859   12263 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:15:15.526543   12263 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:15.527142   12263 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:15.528724   12263 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:15.529277   12263 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:15.530859   12263 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:15:15.534222   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:15:15.534233   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:15:15.602898   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:15:15.602917   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:15:15.634225   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:15:15.634242   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:15:15.693229   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:15:15.693247   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:15:18.205585   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:15:18.217019   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:15:18.217080   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:15:18.243139   54807 cri.go:89] found id: ""
	I1202 19:15:18.243153   54807 logs.go:282] 0 containers: []
	W1202 19:15:18.243160   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:15:18.243176   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:15:18.243234   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:15:18.266826   54807 cri.go:89] found id: ""
	I1202 19:15:18.266839   54807 logs.go:282] 0 containers: []
	W1202 19:15:18.266846   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:15:18.266851   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:15:18.266911   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:15:18.291760   54807 cri.go:89] found id: ""
	I1202 19:15:18.291773   54807 logs.go:282] 0 containers: []
	W1202 19:15:18.291781   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:15:18.291795   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:15:18.291853   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:15:18.315881   54807 cri.go:89] found id: ""
	I1202 19:15:18.315895   54807 logs.go:282] 0 containers: []
	W1202 19:15:18.315902   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:15:18.315907   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:15:18.315963   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:15:18.354620   54807 cri.go:89] found id: ""
	I1202 19:15:18.354633   54807 logs.go:282] 0 containers: []
	W1202 19:15:18.354640   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:15:18.354649   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:15:18.354708   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:15:18.378919   54807 cri.go:89] found id: ""
	I1202 19:15:18.378932   54807 logs.go:282] 0 containers: []
	W1202 19:15:18.378939   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:15:18.378945   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:15:18.379003   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:15:18.403461   54807 cri.go:89] found id: ""
	I1202 19:15:18.403474   54807 logs.go:282] 0 containers: []
	W1202 19:15:18.403482   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:15:18.403489   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:15:18.403499   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:15:18.460043   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:15:18.460062   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:15:18.471326   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:15:18.471343   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:15:18.533325   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:15:18.525005   12371 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:18.525603   12371 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:18.527360   12371 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:18.527971   12371 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:18.529742   12371 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:15:18.525005   12371 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:18.525603   12371 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:18.527360   12371 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:18.527971   12371 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:18.529742   12371 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:15:18.533335   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:15:18.533346   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:15:18.595843   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:15:18.595862   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:15:21.128472   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:15:21.138623   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:15:21.138683   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:15:21.163008   54807 cri.go:89] found id: ""
	I1202 19:15:21.163021   54807 logs.go:282] 0 containers: []
	W1202 19:15:21.163028   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:15:21.163039   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:15:21.163096   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:15:21.186917   54807 cri.go:89] found id: ""
	I1202 19:15:21.186930   54807 logs.go:282] 0 containers: []
	W1202 19:15:21.186937   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:15:21.186942   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:15:21.187000   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:15:21.212853   54807 cri.go:89] found id: ""
	I1202 19:15:21.212866   54807 logs.go:282] 0 containers: []
	W1202 19:15:21.212873   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:15:21.212878   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:15:21.212937   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:15:21.240682   54807 cri.go:89] found id: ""
	I1202 19:15:21.240695   54807 logs.go:282] 0 containers: []
	W1202 19:15:21.240703   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:15:21.240708   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:15:21.240765   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:15:21.264693   54807 cri.go:89] found id: ""
	I1202 19:15:21.264706   54807 logs.go:282] 0 containers: []
	W1202 19:15:21.264713   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:15:21.264718   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:15:21.264778   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:15:21.288193   54807 cri.go:89] found id: ""
	I1202 19:15:21.288207   54807 logs.go:282] 0 containers: []
	W1202 19:15:21.288214   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:15:21.288219   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:15:21.288278   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:15:21.313950   54807 cri.go:89] found id: ""
	I1202 19:15:21.313964   54807 logs.go:282] 0 containers: []
	W1202 19:15:21.313971   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:15:21.313979   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:15:21.313990   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:15:21.324612   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:15:21.324626   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:15:21.388157   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:15:21.380313   12478 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:21.381141   12478 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:21.382761   12478 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:21.383081   12478 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:21.384783   12478 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:15:21.380313   12478 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:21.381141   12478 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:21.382761   12478 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:21.383081   12478 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:21.384783   12478 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:15:21.388177   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:15:21.388188   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:15:21.451835   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:15:21.451853   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:15:21.480172   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:15:21.480187   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:15:24.037107   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:15:24.047291   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:15:24.047362   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:15:24.072397   54807 cri.go:89] found id: ""
	I1202 19:15:24.072411   54807 logs.go:282] 0 containers: []
	W1202 19:15:24.072418   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:15:24.072424   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:15:24.072486   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:15:24.097793   54807 cri.go:89] found id: ""
	I1202 19:15:24.097807   54807 logs.go:282] 0 containers: []
	W1202 19:15:24.097814   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:15:24.097819   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:15:24.097879   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:15:24.122934   54807 cri.go:89] found id: ""
	I1202 19:15:24.122947   54807 logs.go:282] 0 containers: []
	W1202 19:15:24.122954   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:15:24.122960   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:15:24.123020   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:15:24.147849   54807 cri.go:89] found id: ""
	I1202 19:15:24.147863   54807 logs.go:282] 0 containers: []
	W1202 19:15:24.147869   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:15:24.147875   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:15:24.147935   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:15:24.172919   54807 cri.go:89] found id: ""
	I1202 19:15:24.172932   54807 logs.go:282] 0 containers: []
	W1202 19:15:24.172939   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:15:24.172944   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:15:24.173004   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:15:24.197266   54807 cri.go:89] found id: ""
	I1202 19:15:24.197280   54807 logs.go:282] 0 containers: []
	W1202 19:15:24.197287   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:15:24.197293   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:15:24.197351   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:15:24.222541   54807 cri.go:89] found id: ""
	I1202 19:15:24.222555   54807 logs.go:282] 0 containers: []
	W1202 19:15:24.222562   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:15:24.222572   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:15:24.222582   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:15:24.278762   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:15:24.278784   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:15:24.289861   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:15:24.289877   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:15:24.353810   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:15:24.345889   12589 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:24.346412   12589 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:24.347992   12589 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:24.348533   12589 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:24.350299   12589 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:15:24.345889   12589 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:24.346412   12589 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:24.347992   12589 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:24.348533   12589 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:24.350299   12589 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:15:24.353831   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:15:24.353842   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:15:24.416010   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:15:24.416029   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:15:26.947462   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:15:26.958975   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:15:26.959033   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:15:26.992232   54807 cri.go:89] found id: ""
	I1202 19:15:26.992257   54807 logs.go:282] 0 containers: []
	W1202 19:15:26.992264   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:15:26.992270   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:15:26.992354   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:15:27.021036   54807 cri.go:89] found id: ""
	I1202 19:15:27.021049   54807 logs.go:282] 0 containers: []
	W1202 19:15:27.021056   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:15:27.021062   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:15:27.021119   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:15:27.052008   54807 cri.go:89] found id: ""
	I1202 19:15:27.052022   54807 logs.go:282] 0 containers: []
	W1202 19:15:27.052028   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:15:27.052034   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:15:27.052093   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:15:27.076184   54807 cri.go:89] found id: ""
	I1202 19:15:27.076197   54807 logs.go:282] 0 containers: []
	W1202 19:15:27.076204   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:15:27.076209   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:15:27.076266   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:15:27.100296   54807 cri.go:89] found id: ""
	I1202 19:15:27.100308   54807 logs.go:282] 0 containers: []
	W1202 19:15:27.100315   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:15:27.100355   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:15:27.100413   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:15:27.125762   54807 cri.go:89] found id: ""
	I1202 19:15:27.125776   54807 logs.go:282] 0 containers: []
	W1202 19:15:27.125783   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:15:27.125788   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:15:27.125851   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:15:27.150224   54807 cri.go:89] found id: ""
	I1202 19:15:27.150237   54807 logs.go:282] 0 containers: []
	W1202 19:15:27.150244   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:15:27.150252   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:15:27.150262   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:15:27.178321   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:15:27.178338   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:15:27.233465   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:15:27.233484   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:15:27.244423   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:15:27.244437   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:15:27.311220   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:15:27.303476   12710 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:27.303903   12710 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:27.305730   12710 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:27.306227   12710 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:27.307716   12710 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:15:27.303476   12710 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:27.303903   12710 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:27.305730   12710 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:27.306227   12710 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:27.307716   12710 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:15:27.311235   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:15:27.311246   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:15:29.874091   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:15:29.884341   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:15:29.884402   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:15:29.909943   54807 cri.go:89] found id: ""
	I1202 19:15:29.909962   54807 logs.go:282] 0 containers: []
	W1202 19:15:29.909970   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:15:29.909975   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:15:29.910035   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:15:29.947534   54807 cri.go:89] found id: ""
	I1202 19:15:29.947547   54807 logs.go:282] 0 containers: []
	W1202 19:15:29.947554   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:15:29.947559   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:15:29.947617   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:15:29.989319   54807 cri.go:89] found id: ""
	I1202 19:15:29.989335   54807 logs.go:282] 0 containers: []
	W1202 19:15:29.989343   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:15:29.989349   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:15:29.989414   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:15:30.038828   54807 cri.go:89] found id: ""
	I1202 19:15:30.038842   54807 logs.go:282] 0 containers: []
	W1202 19:15:30.038850   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:15:30.038856   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:15:30.038932   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:15:30.067416   54807 cri.go:89] found id: ""
	I1202 19:15:30.067432   54807 logs.go:282] 0 containers: []
	W1202 19:15:30.067440   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:15:30.067446   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:15:30.067509   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:15:30.094866   54807 cri.go:89] found id: ""
	I1202 19:15:30.094881   54807 logs.go:282] 0 containers: []
	W1202 19:15:30.094888   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:15:30.094896   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:15:30.094958   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:15:30.120930   54807 cri.go:89] found id: ""
	I1202 19:15:30.120959   54807 logs.go:282] 0 containers: []
	W1202 19:15:30.120968   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:15:30.120977   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:15:30.120988   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:15:30.177165   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:15:30.177186   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:15:30.188251   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:15:30.188267   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:15:30.255176   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:15:30.247015   12803 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:30.247757   12803 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:30.249382   12803 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:30.250104   12803 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:30.251678   12803 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:15:30.247015   12803 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:30.247757   12803 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:30.249382   12803 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:30.250104   12803 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:30.251678   12803 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:15:30.255194   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:15:30.255205   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:15:30.323165   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:15:30.323189   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:15:32.854201   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:15:32.864404   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:15:32.864467   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:15:32.890146   54807 cri.go:89] found id: ""
	I1202 19:15:32.890160   54807 logs.go:282] 0 containers: []
	W1202 19:15:32.890166   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:15:32.890172   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:15:32.890239   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:15:32.915189   54807 cri.go:89] found id: ""
	I1202 19:15:32.915202   54807 logs.go:282] 0 containers: []
	W1202 19:15:32.915210   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:15:32.915215   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:15:32.915286   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:15:32.952949   54807 cri.go:89] found id: ""
	I1202 19:15:32.952962   54807 logs.go:282] 0 containers: []
	W1202 19:15:32.952969   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:15:32.952975   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:15:32.953031   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:15:32.986345   54807 cri.go:89] found id: ""
	I1202 19:15:32.986359   54807 logs.go:282] 0 containers: []
	W1202 19:15:32.986366   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:15:32.986371   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:15:32.986435   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:15:33.010880   54807 cri.go:89] found id: ""
	I1202 19:15:33.010894   54807 logs.go:282] 0 containers: []
	W1202 19:15:33.010902   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:15:33.010908   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:15:33.010966   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:15:33.039327   54807 cri.go:89] found id: ""
	I1202 19:15:33.039341   54807 logs.go:282] 0 containers: []
	W1202 19:15:33.039348   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:15:33.039354   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:15:33.039412   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:15:33.064437   54807 cri.go:89] found id: ""
	I1202 19:15:33.064463   54807 logs.go:282] 0 containers: []
	W1202 19:15:33.064470   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:15:33.064478   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:15:33.064488   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:15:33.120755   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:15:33.120773   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:15:33.132552   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:15:33.132575   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:15:33.199378   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:15:33.191204   12907 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:33.191873   12907 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:33.193517   12907 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:33.194121   12907 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:33.195812   12907 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:15:33.191204   12907 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:33.191873   12907 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:33.193517   12907 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:33.194121   12907 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:33.195812   12907 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:15:33.199389   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:15:33.199401   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:15:33.266899   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:15:33.266918   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:15:35.796024   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:15:35.807086   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:15:35.807146   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:15:35.839365   54807 cri.go:89] found id: ""
	I1202 19:15:35.839378   54807 logs.go:282] 0 containers: []
	W1202 19:15:35.839394   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:15:35.839400   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:15:35.839469   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:15:35.872371   54807 cri.go:89] found id: ""
	I1202 19:15:35.872385   54807 logs.go:282] 0 containers: []
	W1202 19:15:35.872393   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:15:35.872398   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:15:35.872467   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:15:35.901242   54807 cri.go:89] found id: ""
	I1202 19:15:35.901255   54807 logs.go:282] 0 containers: []
	W1202 19:15:35.901262   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:15:35.901268   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:15:35.901326   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:15:35.936195   54807 cri.go:89] found id: ""
	I1202 19:15:35.936209   54807 logs.go:282] 0 containers: []
	W1202 19:15:35.936215   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:15:35.936221   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:15:35.936282   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:15:35.965129   54807 cri.go:89] found id: ""
	I1202 19:15:35.965145   54807 logs.go:282] 0 containers: []
	W1202 19:15:35.965153   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:15:35.965159   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:15:35.966675   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:15:35.998286   54807 cri.go:89] found id: ""
	I1202 19:15:35.998299   54807 logs.go:282] 0 containers: []
	W1202 19:15:35.998306   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:15:35.998311   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:15:35.998371   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:15:36.024787   54807 cri.go:89] found id: ""
	I1202 19:15:36.024800   54807 logs.go:282] 0 containers: []
	W1202 19:15:36.024812   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:15:36.024820   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:15:36.024829   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:15:36.081130   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:15:36.081146   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:15:36.092692   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:15:36.092714   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:15:36.154814   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:15:36.146918   13010 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:36.147704   13010 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:36.149430   13010 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:36.149884   13010 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:36.151372   13010 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:15:36.146918   13010 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:36.147704   13010 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:36.149430   13010 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:36.149884   13010 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:36.151372   13010 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:15:36.154824   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:15:36.154837   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:15:36.218034   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:15:36.218052   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:15:38.748085   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:15:38.758270   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:15:38.758328   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:15:38.786304   54807 cri.go:89] found id: ""
	I1202 19:15:38.786317   54807 logs.go:282] 0 containers: []
	W1202 19:15:38.786325   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:15:38.786330   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:15:38.786389   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:15:38.811113   54807 cri.go:89] found id: ""
	I1202 19:15:38.811126   54807 logs.go:282] 0 containers: []
	W1202 19:15:38.811134   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:15:38.811139   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:15:38.811223   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:15:38.836191   54807 cri.go:89] found id: ""
	I1202 19:15:38.836207   54807 logs.go:282] 0 containers: []
	W1202 19:15:38.836214   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:15:38.836219   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:15:38.836278   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:15:38.860383   54807 cri.go:89] found id: ""
	I1202 19:15:38.860396   54807 logs.go:282] 0 containers: []
	W1202 19:15:38.860403   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:15:38.860410   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:15:38.860469   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:15:38.887750   54807 cri.go:89] found id: ""
	I1202 19:15:38.887764   54807 logs.go:282] 0 containers: []
	W1202 19:15:38.887770   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:15:38.887775   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:15:38.887834   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:15:38.914103   54807 cri.go:89] found id: ""
	I1202 19:15:38.914116   54807 logs.go:282] 0 containers: []
	W1202 19:15:38.914123   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:15:38.914128   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:15:38.914184   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:15:38.950405   54807 cri.go:89] found id: ""
	I1202 19:15:38.950418   54807 logs.go:282] 0 containers: []
	W1202 19:15:38.950425   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:15:38.950433   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:15:38.950442   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:15:39.016206   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:15:39.016225   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:15:39.026699   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:15:39.026714   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:15:39.090183   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:15:39.082441   13115 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:39.083071   13115 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:39.084892   13115 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:39.085258   13115 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:39.086741   13115 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:15:39.082441   13115 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:39.083071   13115 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:39.084892   13115 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:39.085258   13115 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:39.086741   13115 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:15:39.090195   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:15:39.090206   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:15:39.151533   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:15:39.151551   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:15:41.681058   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:15:41.691353   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:15:41.691417   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:15:41.716684   54807 cri.go:89] found id: ""
	I1202 19:15:41.716697   54807 logs.go:282] 0 containers: []
	W1202 19:15:41.716704   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:15:41.716710   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:15:41.716768   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:15:41.742096   54807 cri.go:89] found id: ""
	I1202 19:15:41.742110   54807 logs.go:282] 0 containers: []
	W1202 19:15:41.742117   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:15:41.742122   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:15:41.742182   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:15:41.766652   54807 cri.go:89] found id: ""
	I1202 19:15:41.766665   54807 logs.go:282] 0 containers: []
	W1202 19:15:41.766672   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:15:41.766678   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:15:41.766741   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:15:41.791517   54807 cri.go:89] found id: ""
	I1202 19:15:41.791531   54807 logs.go:282] 0 containers: []
	W1202 19:15:41.791538   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:15:41.791544   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:15:41.791600   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:15:41.817700   54807 cri.go:89] found id: ""
	I1202 19:15:41.817713   54807 logs.go:282] 0 containers: []
	W1202 19:15:41.817720   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:15:41.817725   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:15:41.817786   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:15:41.846078   54807 cri.go:89] found id: ""
	I1202 19:15:41.846092   54807 logs.go:282] 0 containers: []
	W1202 19:15:41.846099   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:15:41.846104   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:15:41.846161   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:15:41.874235   54807 cri.go:89] found id: ""
	I1202 19:15:41.874249   54807 logs.go:282] 0 containers: []
	W1202 19:15:41.874258   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:15:41.874268   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:15:41.874278   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:15:41.942286   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:15:41.942307   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:15:41.989723   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:15:41.989740   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:15:42.047707   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:15:42.047728   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:15:42.061053   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:15:42.061073   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:15:42.138885   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:15:42.129369   13235 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:42.130125   13235 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:42.131195   13235 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:42.132151   13235 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:42.133038   13235 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:15:42.129369   13235 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:42.130125   13235 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:42.131195   13235 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:42.132151   13235 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:42.133038   13235 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:15:44.639103   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:15:44.648984   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:15:44.649044   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:15:44.673076   54807 cri.go:89] found id: ""
	I1202 19:15:44.673091   54807 logs.go:282] 0 containers: []
	W1202 19:15:44.673098   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:15:44.673105   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:15:44.673162   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:15:44.696488   54807 cri.go:89] found id: ""
	I1202 19:15:44.696501   54807 logs.go:282] 0 containers: []
	W1202 19:15:44.696507   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:15:44.696512   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:15:44.696568   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:15:44.722164   54807 cri.go:89] found id: ""
	I1202 19:15:44.722177   54807 logs.go:282] 0 containers: []
	W1202 19:15:44.722184   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:15:44.722190   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:15:44.722254   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:15:44.745410   54807 cri.go:89] found id: ""
	I1202 19:15:44.745424   54807 logs.go:282] 0 containers: []
	W1202 19:15:44.745431   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:15:44.745437   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:15:44.745494   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:15:44.769317   54807 cri.go:89] found id: ""
	I1202 19:15:44.769330   54807 logs.go:282] 0 containers: []
	W1202 19:15:44.769337   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:15:44.769342   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:15:44.769404   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:15:44.794282   54807 cri.go:89] found id: ""
	I1202 19:15:44.794295   54807 logs.go:282] 0 containers: []
	W1202 19:15:44.794302   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:15:44.794308   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:15:44.794369   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:15:44.818676   54807 cri.go:89] found id: ""
	I1202 19:15:44.818689   54807 logs.go:282] 0 containers: []
	W1202 19:15:44.818696   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:15:44.818703   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:15:44.818734   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:15:44.829491   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:15:44.829506   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:15:44.892401   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:15:44.884881   13319 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:44.885617   13319 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:44.887285   13319 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:44.887590   13319 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:44.889113   13319 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:15:44.884881   13319 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:44.885617   13319 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:44.887285   13319 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:44.887590   13319 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:44.889113   13319 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:15:44.892427   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:15:44.892438   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:15:44.961436   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:15:44.961457   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:15:45.004301   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:15:45.004340   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:15:47.597359   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:15:47.607380   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:15:47.607436   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:15:47.632361   54807 cri.go:89] found id: ""
	I1202 19:15:47.632375   54807 logs.go:282] 0 containers: []
	W1202 19:15:47.632382   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:15:47.632387   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:15:47.632443   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:15:47.657478   54807 cri.go:89] found id: ""
	I1202 19:15:47.657491   54807 logs.go:282] 0 containers: []
	W1202 19:15:47.657498   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:15:47.657504   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:15:47.657565   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:15:47.681973   54807 cri.go:89] found id: ""
	I1202 19:15:47.681987   54807 logs.go:282] 0 containers: []
	W1202 19:15:47.681994   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:15:47.681999   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:15:47.682054   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:15:47.705968   54807 cri.go:89] found id: ""
	I1202 19:15:47.705982   54807 logs.go:282] 0 containers: []
	W1202 19:15:47.705988   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:15:47.705994   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:15:47.706051   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:15:47.730910   54807 cri.go:89] found id: ""
	I1202 19:15:47.730923   54807 logs.go:282] 0 containers: []
	W1202 19:15:47.730930   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:15:47.730935   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:15:47.730992   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:15:47.757739   54807 cri.go:89] found id: ""
	I1202 19:15:47.757752   54807 logs.go:282] 0 containers: []
	W1202 19:15:47.757759   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:15:47.757764   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:15:47.757820   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:15:47.782566   54807 cri.go:89] found id: ""
	I1202 19:15:47.782579   54807 logs.go:282] 0 containers: []
	W1202 19:15:47.782586   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:15:47.782594   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:15:47.782605   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:15:47.845974   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:15:47.837753   13419 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:47.838402   13419 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:47.840192   13419 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:47.840777   13419 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:47.842546   13419 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:15:47.837753   13419 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:47.838402   13419 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:47.840192   13419 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:47.840777   13419 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:47.842546   13419 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:15:47.845983   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:15:47.845994   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:15:47.913035   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:15:47.913054   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:15:47.952076   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:15:47.952091   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:15:48.023577   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:15:48.023596   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:15:50.534902   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:15:50.544843   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:15:50.544904   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:15:50.573435   54807 cri.go:89] found id: ""
	I1202 19:15:50.573449   54807 logs.go:282] 0 containers: []
	W1202 19:15:50.573456   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:15:50.573462   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:15:50.573524   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:15:50.598029   54807 cri.go:89] found id: ""
	I1202 19:15:50.598043   54807 logs.go:282] 0 containers: []
	W1202 19:15:50.598051   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:15:50.598056   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:15:50.598115   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:15:50.623452   54807 cri.go:89] found id: ""
	I1202 19:15:50.623465   54807 logs.go:282] 0 containers: []
	W1202 19:15:50.623472   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:15:50.623478   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:15:50.623536   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:15:50.648357   54807 cri.go:89] found id: ""
	I1202 19:15:50.648371   54807 logs.go:282] 0 containers: []
	W1202 19:15:50.648378   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:15:50.648383   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:15:50.648441   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:15:50.672042   54807 cri.go:89] found id: ""
	I1202 19:15:50.672056   54807 logs.go:282] 0 containers: []
	W1202 19:15:50.672063   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:15:50.672068   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:15:50.672125   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:15:50.697434   54807 cri.go:89] found id: ""
	I1202 19:15:50.697448   54807 logs.go:282] 0 containers: []
	W1202 19:15:50.697455   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:15:50.697461   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:15:50.697525   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:15:50.728291   54807 cri.go:89] found id: ""
	I1202 19:15:50.728305   54807 logs.go:282] 0 containers: []
	W1202 19:15:50.728312   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:15:50.728340   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:15:50.728351   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:15:50.790193   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:15:50.781764   13524 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:50.782494   13524 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:50.784122   13524 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:50.784535   13524 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:50.786186   13524 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:15:50.781764   13524 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:50.782494   13524 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:50.784122   13524 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:50.784535   13524 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:50.786186   13524 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:15:50.790203   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:15:50.790214   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:15:50.855933   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:15:50.855951   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:15:50.884682   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:15:50.884698   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:15:50.949404   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:15:50.949423   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:15:53.461440   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:15:53.471831   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:15:53.471906   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:15:53.496591   54807 cri.go:89] found id: ""
	I1202 19:15:53.496604   54807 logs.go:282] 0 containers: []
	W1202 19:15:53.496611   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:15:53.496617   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:15:53.496674   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:15:53.521087   54807 cri.go:89] found id: ""
	I1202 19:15:53.521103   54807 logs.go:282] 0 containers: []
	W1202 19:15:53.521111   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:15:53.521116   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:15:53.521174   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:15:53.545148   54807 cri.go:89] found id: ""
	I1202 19:15:53.545161   54807 logs.go:282] 0 containers: []
	W1202 19:15:53.545168   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:15:53.545173   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:15:53.545231   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:15:53.570884   54807 cri.go:89] found id: ""
	I1202 19:15:53.570898   54807 logs.go:282] 0 containers: []
	W1202 19:15:53.570904   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:15:53.570910   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:15:53.570972   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:15:53.597220   54807 cri.go:89] found id: ""
	I1202 19:15:53.597234   54807 logs.go:282] 0 containers: []
	W1202 19:15:53.597241   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:15:53.597247   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:15:53.597326   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:15:53.626817   54807 cri.go:89] found id: ""
	I1202 19:15:53.626830   54807 logs.go:282] 0 containers: []
	W1202 19:15:53.626837   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:15:53.626843   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:15:53.626901   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:15:53.656721   54807 cri.go:89] found id: ""
	I1202 19:15:53.656734   54807 logs.go:282] 0 containers: []
	W1202 19:15:53.656741   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:15:53.656750   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:15:53.656762   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:15:53.721841   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:15:53.712824   13626 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:53.713681   13626 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:53.715369   13626 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:53.715912   13626 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:53.717503   13626 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:15:53.712824   13626 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:53.713681   13626 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:53.715369   13626 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:53.715912   13626 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:53.717503   13626 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:15:53.721850   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:15:53.721862   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:15:53.785783   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:15:53.785801   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:15:53.815658   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:15:53.815673   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:15:53.873221   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:15:53.873238   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:15:56.384447   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:15:56.394843   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:15:56.394909   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:15:56.425129   54807 cri.go:89] found id: ""
	I1202 19:15:56.425142   54807 logs.go:282] 0 containers: []
	W1202 19:15:56.425149   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:15:56.425154   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:15:56.425212   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:15:56.451236   54807 cri.go:89] found id: ""
	I1202 19:15:56.451250   54807 logs.go:282] 0 containers: []
	W1202 19:15:56.451257   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:15:56.451263   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:15:56.451327   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:15:56.476585   54807 cri.go:89] found id: ""
	I1202 19:15:56.476599   54807 logs.go:282] 0 containers: []
	W1202 19:15:56.476606   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:15:56.476611   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:15:56.476669   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:15:56.501814   54807 cri.go:89] found id: ""
	I1202 19:15:56.501828   54807 logs.go:282] 0 containers: []
	W1202 19:15:56.501834   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:15:56.501840   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:15:56.501900   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:15:56.530866   54807 cri.go:89] found id: ""
	I1202 19:15:56.530879   54807 logs.go:282] 0 containers: []
	W1202 19:15:56.530886   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:15:56.530891   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:15:56.530959   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:15:56.555014   54807 cri.go:89] found id: ""
	I1202 19:15:56.555029   54807 logs.go:282] 0 containers: []
	W1202 19:15:56.555036   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:15:56.555042   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:15:56.555102   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:15:56.582644   54807 cri.go:89] found id: ""
	I1202 19:15:56.582657   54807 logs.go:282] 0 containers: []
	W1202 19:15:56.582664   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:15:56.582672   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:15:56.582684   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:15:56.637937   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:15:56.637955   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:15:56.648656   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:15:56.648672   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:15:56.716929   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:15:56.708385   13741 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:56.708981   13741 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:56.710802   13741 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:56.711377   13741 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:56.712990   13741 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:15:56.708385   13741 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:56.708981   13741 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:56.710802   13741 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:56.711377   13741 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:56.712990   13741 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:15:56.716939   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:15:56.716950   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:15:56.783854   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:15:56.783880   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:15:59.312498   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:15:59.322671   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:15:59.322730   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:15:59.346425   54807 cri.go:89] found id: ""
	I1202 19:15:59.346439   54807 logs.go:282] 0 containers: []
	W1202 19:15:59.346446   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:15:59.346452   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:15:59.346515   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:15:59.371199   54807 cri.go:89] found id: ""
	I1202 19:15:59.371212   54807 logs.go:282] 0 containers: []
	W1202 19:15:59.371219   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:15:59.371224   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:15:59.371286   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:15:59.398444   54807 cri.go:89] found id: ""
	I1202 19:15:59.398458   54807 logs.go:282] 0 containers: []
	W1202 19:15:59.398465   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:15:59.398470   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:15:59.398528   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:15:59.423109   54807 cri.go:89] found id: ""
	I1202 19:15:59.423122   54807 logs.go:282] 0 containers: []
	W1202 19:15:59.423129   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:15:59.423135   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:15:59.423193   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:15:59.448440   54807 cri.go:89] found id: ""
	I1202 19:15:59.448454   54807 logs.go:282] 0 containers: []
	W1202 19:15:59.448461   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:15:59.448469   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:15:59.448539   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:15:59.472288   54807 cri.go:89] found id: ""
	I1202 19:15:59.472302   54807 logs.go:282] 0 containers: []
	W1202 19:15:59.472309   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:15:59.472315   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:15:59.472396   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:15:59.501959   54807 cri.go:89] found id: ""
	I1202 19:15:59.501973   54807 logs.go:282] 0 containers: []
	W1202 19:15:59.501980   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:15:59.501987   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:15:59.501999   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:15:59.562783   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:15:59.555099   13842 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:59.555718   13842 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:59.557259   13842 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:59.557814   13842 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:59.559465   13842 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:15:59.555099   13842 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:59.555718   13842 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:59.557259   13842 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:59.557814   13842 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:59.559465   13842 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:15:59.562800   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:15:59.562811   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:15:59.626612   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:15:59.626631   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:15:59.655068   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:15:59.655083   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:15:59.713332   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:15:59.713350   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:16:02.224451   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:16:02.234704   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:16:02.234765   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:16:02.260861   54807 cri.go:89] found id: ""
	I1202 19:16:02.260875   54807 logs.go:282] 0 containers: []
	W1202 19:16:02.260882   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:16:02.260888   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:16:02.260951   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:16:02.286334   54807 cri.go:89] found id: ""
	I1202 19:16:02.286354   54807 logs.go:282] 0 containers: []
	W1202 19:16:02.286362   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:16:02.286367   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:16:02.286426   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:16:02.310961   54807 cri.go:89] found id: ""
	I1202 19:16:02.310975   54807 logs.go:282] 0 containers: []
	W1202 19:16:02.310982   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:16:02.310988   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:16:02.311050   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:16:02.339645   54807 cri.go:89] found id: ""
	I1202 19:16:02.339658   54807 logs.go:282] 0 containers: []
	W1202 19:16:02.339665   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:16:02.339670   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:16:02.339727   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:16:02.364456   54807 cri.go:89] found id: ""
	I1202 19:16:02.364471   54807 logs.go:282] 0 containers: []
	W1202 19:16:02.364478   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:16:02.364484   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:16:02.364547   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:16:02.394258   54807 cri.go:89] found id: ""
	I1202 19:16:02.394272   54807 logs.go:282] 0 containers: []
	W1202 19:16:02.394278   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:16:02.394284   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:16:02.394342   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:16:02.418723   54807 cri.go:89] found id: ""
	I1202 19:16:02.418737   54807 logs.go:282] 0 containers: []
	W1202 19:16:02.418744   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:16:02.418752   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:16:02.418762   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:16:02.482679   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:16:02.474517   13947 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:02.475378   13947 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:02.476933   13947 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:02.477486   13947 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:02.479002   13947 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:16:02.474517   13947 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:02.475378   13947 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:02.476933   13947 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:02.477486   13947 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:02.479002   13947 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:16:02.482690   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:16:02.482700   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:16:02.548276   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:16:02.548295   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:16:02.578369   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:16:02.578386   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:16:02.636563   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:16:02.636581   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
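The repeated `dial tcp [::1]:8441: connect: connection refused` errors above indicate nothing is listening on the apiserver port inside the node. As a quick manual check mirroring that failure (a sketch only — port 8441 and localhost are taken from the log; this uses bash's `/dev/tcp` pseudo-device, so it requires bash, not plain sh):

```shell
#!/usr/bin/env bash
# Probe whether anything is listening on a local TCP port.
# Mirrors the failing kubectl connection to https://localhost:8441 seen above.
check_apiserver_port() {
  local port="${1:-8441}"
  # Opening /dev/tcp in a subshell attempts a TCP connect; failure means
  # "connection refused" (or no listener), matching the log's error.
  if (exec 3<>"/dev/tcp/127.0.0.1/${port}") 2>/dev/null; then
    echo "open"
  else
    echo "refused"
  fi
}
check_apiserver_port 8441
```

If this prints `refused`, the apiserver container never came up — consistent with the `No container was found matching "kube-apiserver"` lines in the surrounding log.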
	I1202 19:16:05.147857   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:16:05.158273   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:16:05.158332   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:16:05.198133   54807 cri.go:89] found id: ""
	I1202 19:16:05.198149   54807 logs.go:282] 0 containers: []
	W1202 19:16:05.198161   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:16:05.198167   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:16:05.198230   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:16:05.229481   54807 cri.go:89] found id: ""
	I1202 19:16:05.229494   54807 logs.go:282] 0 containers: []
	W1202 19:16:05.229508   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:16:05.229513   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:16:05.229573   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:16:05.255940   54807 cri.go:89] found id: ""
	I1202 19:16:05.255954   54807 logs.go:282] 0 containers: []
	W1202 19:16:05.255961   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:16:05.255967   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:16:05.256027   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:16:05.281978   54807 cri.go:89] found id: ""
	I1202 19:16:05.281991   54807 logs.go:282] 0 containers: []
	W1202 19:16:05.281998   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:16:05.282004   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:16:05.282063   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:16:05.310511   54807 cri.go:89] found id: ""
	I1202 19:16:05.310525   54807 logs.go:282] 0 containers: []
	W1202 19:16:05.310533   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:16:05.310539   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:16:05.310605   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:16:05.340114   54807 cri.go:89] found id: ""
	I1202 19:16:05.340127   54807 logs.go:282] 0 containers: []
	W1202 19:16:05.340135   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:16:05.340140   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:16:05.340198   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:16:05.366243   54807 cri.go:89] found id: ""
	I1202 19:16:05.366256   54807 logs.go:282] 0 containers: []
	W1202 19:16:05.366263   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:16:05.366271   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:16:05.366283   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:16:05.393993   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:16:05.394009   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:16:05.450279   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:16:05.450299   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:16:05.461585   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:16:05.461602   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:16:05.528601   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:16:05.520245   14070 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:05.521019   14070 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:05.522739   14070 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:05.523260   14070 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:05.524914   14070 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:16:05.520245   14070 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:05.521019   14070 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:05.522739   14070 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:05.523260   14070 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:05.524914   14070 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:16:05.528610   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:16:05.528621   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:16:08.097252   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:16:08.107731   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:16:08.107792   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:16:08.134215   54807 cri.go:89] found id: ""
	I1202 19:16:08.134240   54807 logs.go:282] 0 containers: []
	W1202 19:16:08.134248   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:16:08.134255   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:16:08.134327   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:16:08.160174   54807 cri.go:89] found id: ""
	I1202 19:16:08.160188   54807 logs.go:282] 0 containers: []
	W1202 19:16:08.160195   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:16:08.160200   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:16:08.160259   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:16:08.188835   54807 cri.go:89] found id: ""
	I1202 19:16:08.188849   54807 logs.go:282] 0 containers: []
	W1202 19:16:08.188856   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:16:08.188871   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:16:08.188930   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:16:08.222672   54807 cri.go:89] found id: ""
	I1202 19:16:08.222686   54807 logs.go:282] 0 containers: []
	W1202 19:16:08.222703   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:16:08.222710   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:16:08.222774   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:16:08.252685   54807 cri.go:89] found id: ""
	I1202 19:16:08.252699   54807 logs.go:282] 0 containers: []
	W1202 19:16:08.252705   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:16:08.252711   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:16:08.252767   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:16:08.281659   54807 cri.go:89] found id: ""
	I1202 19:16:08.281672   54807 logs.go:282] 0 containers: []
	W1202 19:16:08.281679   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:16:08.281685   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:16:08.281757   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:16:08.306909   54807 cri.go:89] found id: ""
	I1202 19:16:08.306922   54807 logs.go:282] 0 containers: []
	W1202 19:16:08.306929   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:16:08.306936   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:16:08.306947   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:16:08.363919   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:16:08.363938   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:16:08.375138   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:16:08.375154   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:16:08.443392   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:16:08.435786   14165 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:08.436517   14165 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:08.438269   14165 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:08.438769   14165 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:08.439921   14165 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:16:08.435786   14165 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:08.436517   14165 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:08.438269   14165 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:08.438769   14165 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:08.439921   14165 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:16:08.443414   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:16:08.443428   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:16:08.507474   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:16:08.507492   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
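Each polling cycle above runs the same `crictl ps -a --quiet --name=<component>` check for every control-plane component. A minimal sketch of that loop, reconstructed from the log (assumption: it would be run inside the minikube node; `DRY_RUN=1`, the default here, only echoes the commands so it is safe to run anywhere):

```shell
#!/usr/bin/env bash
# Sketch of minikube's diagnostic polling loop seen in the log above.
# DRY_RUN=1 (default) prints each command instead of executing it.
DRY_RUN="${DRY_RUN:-1}"
run() {
  if [ "$DRY_RUN" = "1" ]; then
    echo "+ $*"        # show what would be executed
  else
    eval "$*"          # actually query the CRI runtime
  fi
}
# The same component list minikube iterates over in the log:
for name in kube-apiserver etcd coredns kube-scheduler \
            kube-proxy kube-controller-manager kindnet; do
  run "sudo crictl ps -a --quiet --name=${name}"
done
```

In the log, every one of these queries returns an empty ID list (`found id: ""`), which is why the subsequent `kubectl describe nodes` against localhost:8441 keeps failing.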
	I1202 19:16:11.037665   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:16:11.050056   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:16:11.050130   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:16:11.076993   54807 cri.go:89] found id: ""
	I1202 19:16:11.077008   54807 logs.go:282] 0 containers: []
	W1202 19:16:11.077015   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:16:11.077021   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:16:11.077088   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:16:11.104370   54807 cri.go:89] found id: ""
	I1202 19:16:11.104384   54807 logs.go:282] 0 containers: []
	W1202 19:16:11.104393   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:16:11.104399   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:16:11.104463   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:16:11.132145   54807 cri.go:89] found id: ""
	I1202 19:16:11.132160   54807 logs.go:282] 0 containers: []
	W1202 19:16:11.132168   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:16:11.132174   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:16:11.132235   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:16:11.158847   54807 cri.go:89] found id: ""
	I1202 19:16:11.158861   54807 logs.go:282] 0 containers: []
	W1202 19:16:11.158868   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:16:11.158874   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:16:11.158934   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:16:11.198715   54807 cri.go:89] found id: ""
	I1202 19:16:11.198729   54807 logs.go:282] 0 containers: []
	W1202 19:16:11.198736   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:16:11.198741   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:16:11.198804   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:16:11.230867   54807 cri.go:89] found id: ""
	I1202 19:16:11.230886   54807 logs.go:282] 0 containers: []
	W1202 19:16:11.230893   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:16:11.230899   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:16:11.230957   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:16:11.259807   54807 cri.go:89] found id: ""
	I1202 19:16:11.259821   54807 logs.go:282] 0 containers: []
	W1202 19:16:11.259828   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:16:11.259836   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:16:11.259846   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:16:11.287151   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:16:11.287167   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:16:11.344009   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:16:11.344032   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:16:11.354412   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:16:11.354433   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:16:11.420896   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:16:11.412603   14280 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:11.413632   14280 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:11.415146   14280 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:11.415437   14280 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:11.416861   14280 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:16:11.412603   14280 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:11.413632   14280 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:11.415146   14280 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:11.415437   14280 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:11.416861   14280 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:16:11.420906   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:16:11.420917   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:16:13.984421   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:16:13.995238   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:16:13.995302   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:16:14.021325   54807 cri.go:89] found id: ""
	I1202 19:16:14.021338   54807 logs.go:282] 0 containers: []
	W1202 19:16:14.021345   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:16:14.021350   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:16:14.021407   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:16:14.047264   54807 cri.go:89] found id: ""
	I1202 19:16:14.047278   54807 logs.go:282] 0 containers: []
	W1202 19:16:14.047285   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:16:14.047291   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:16:14.047355   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:16:14.071231   54807 cri.go:89] found id: ""
	I1202 19:16:14.071245   54807 logs.go:282] 0 containers: []
	W1202 19:16:14.071252   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:16:14.071257   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:16:14.071315   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:16:14.096289   54807 cri.go:89] found id: ""
	I1202 19:16:14.096302   54807 logs.go:282] 0 containers: []
	W1202 19:16:14.096309   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:16:14.096315   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:16:14.096397   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:16:14.122522   54807 cri.go:89] found id: ""
	I1202 19:16:14.122535   54807 logs.go:282] 0 containers: []
	W1202 19:16:14.122542   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:16:14.122548   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:16:14.122608   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:16:14.151408   54807 cri.go:89] found id: ""
	I1202 19:16:14.151422   54807 logs.go:282] 0 containers: []
	W1202 19:16:14.151429   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:16:14.151435   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:16:14.151496   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:16:14.182327   54807 cri.go:89] found id: ""
	I1202 19:16:14.182340   54807 logs.go:282] 0 containers: []
	W1202 19:16:14.182347   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:16:14.182355   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:16:14.182365   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:16:14.246777   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:16:14.246796   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:16:14.262093   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:16:14.262108   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:16:14.326058   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:16:14.317581   14371 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:14.318458   14371 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:14.320176   14371 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:14.320802   14371 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:14.322292   14371 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:16:14.317581   14371 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:14.318458   14371 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:14.320176   14371 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:14.320802   14371 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:14.322292   14371 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:16:14.326068   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:16:14.326080   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:16:14.388559   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:16:14.388578   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:16:16.920108   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:16:16.930319   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:16:16.930382   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:16:16.955799   54807 cri.go:89] found id: ""
	I1202 19:16:16.955813   54807 logs.go:282] 0 containers: []
	W1202 19:16:16.955820   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:16:16.955825   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:16:16.955882   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:16:16.982139   54807 cri.go:89] found id: ""
	I1202 19:16:16.982153   54807 logs.go:282] 0 containers: []
	W1202 19:16:16.982160   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:16:16.982165   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:16:16.982223   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:16:17.007837   54807 cri.go:89] found id: ""
	I1202 19:16:17.007851   54807 logs.go:282] 0 containers: []
	W1202 19:16:17.007857   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:16:17.007863   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:16:17.007933   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:16:17.034216   54807 cri.go:89] found id: ""
	I1202 19:16:17.034229   54807 logs.go:282] 0 containers: []
	W1202 19:16:17.034236   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:16:17.034241   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:16:17.034298   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:16:17.063913   54807 cri.go:89] found id: ""
	I1202 19:16:17.063927   54807 logs.go:282] 0 containers: []
	W1202 19:16:17.063934   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:16:17.063939   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:16:17.063997   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:16:17.088826   54807 cri.go:89] found id: ""
	I1202 19:16:17.088840   54807 logs.go:282] 0 containers: []
	W1202 19:16:17.088847   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:16:17.088853   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:16:17.088913   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:16:17.114356   54807 cri.go:89] found id: ""
	I1202 19:16:17.114370   54807 logs.go:282] 0 containers: []
	W1202 19:16:17.114376   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:16:17.114384   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:16:17.114394   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:16:17.171571   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:16:17.171591   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:16:17.192662   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:16:17.192677   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:16:17.265860   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:16:17.257716   14475 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:17.258547   14475 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:17.260144   14475 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:17.260824   14475 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:17.262474   14475 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:16:17.257716   14475 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:17.258547   14475 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:17.260144   14475 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:17.260824   14475 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:17.262474   14475 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:16:17.265870   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:16:17.265883   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:16:17.329636   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:16:17.329654   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:16:19.857139   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:16:19.867414   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:16:19.867471   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:16:19.891736   54807 cri.go:89] found id: ""
	I1202 19:16:19.891750   54807 logs.go:282] 0 containers: []
	W1202 19:16:19.891757   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:16:19.891762   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:16:19.891819   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:16:19.916840   54807 cri.go:89] found id: ""
	I1202 19:16:19.916854   54807 logs.go:282] 0 containers: []
	W1202 19:16:19.916861   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:16:19.916881   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:16:19.916938   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:16:19.941623   54807 cri.go:89] found id: ""
	I1202 19:16:19.941636   54807 logs.go:282] 0 containers: []
	W1202 19:16:19.941643   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:16:19.941649   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:16:19.941706   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:16:19.973037   54807 cri.go:89] found id: ""
	I1202 19:16:19.973051   54807 logs.go:282] 0 containers: []
	W1202 19:16:19.973059   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:16:19.973065   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:16:19.973134   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:16:20.000748   54807 cri.go:89] found id: ""
	I1202 19:16:20.000765   54807 logs.go:282] 0 containers: []
	W1202 19:16:20.000773   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:16:20.000780   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:16:20.000851   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:16:20.025854   54807 cri.go:89] found id: ""
	I1202 19:16:20.025868   54807 logs.go:282] 0 containers: []
	W1202 19:16:20.025875   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:16:20.025881   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:16:20.025940   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:16:20.052281   54807 cri.go:89] found id: ""
	I1202 19:16:20.052296   54807 logs.go:282] 0 containers: []
	W1202 19:16:20.052304   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:16:20.052312   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:16:20.052346   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:16:20.120511   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:16:20.111945   14568 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:20.112719   14568 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:20.114519   14568 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:20.115208   14568 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:20.116979   14568 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:16:20.111945   14568 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:20.112719   14568 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:20.114519   14568 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:20.115208   14568 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:20.116979   14568 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:16:20.120542   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:16:20.120557   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:16:20.192068   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:16:20.192088   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:16:20.232059   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:16:20.232074   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:16:20.287505   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:16:20.287527   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:16:22.798885   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:16:22.808880   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:16:22.808947   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:16:22.838711   54807 cri.go:89] found id: ""
	I1202 19:16:22.838736   54807 logs.go:282] 0 containers: []
	W1202 19:16:22.838744   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:16:22.838750   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:16:22.838815   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:16:22.866166   54807 cri.go:89] found id: ""
	I1202 19:16:22.866180   54807 logs.go:282] 0 containers: []
	W1202 19:16:22.866187   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:16:22.866192   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:16:22.866250   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:16:22.890456   54807 cri.go:89] found id: ""
	I1202 19:16:22.890470   54807 logs.go:282] 0 containers: []
	W1202 19:16:22.890484   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:16:22.890490   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:16:22.890554   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:16:22.915548   54807 cri.go:89] found id: ""
	I1202 19:16:22.915562   54807 logs.go:282] 0 containers: []
	W1202 19:16:22.915578   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:16:22.915585   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:16:22.915643   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:16:22.940011   54807 cri.go:89] found id: ""
	I1202 19:16:22.940025   54807 logs.go:282] 0 containers: []
	W1202 19:16:22.940032   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:16:22.940037   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:16:22.940093   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:16:22.965647   54807 cri.go:89] found id: ""
	I1202 19:16:22.965660   54807 logs.go:282] 0 containers: []
	W1202 19:16:22.965670   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:16:22.965677   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:16:22.965744   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:16:22.994566   54807 cri.go:89] found id: ""
	I1202 19:16:22.994580   54807 logs.go:282] 0 containers: []
	W1202 19:16:22.994587   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:16:22.994595   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:16:22.994611   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:16:23.050953   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:16:23.050973   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:16:23.061610   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:16:23.061624   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:16:23.127525   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:16:23.119520   14680 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:23.120161   14680 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:23.121888   14680 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:23.122530   14680 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:23.124179   14680 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:16:23.119520   14680 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:23.120161   14680 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:23.121888   14680 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:23.122530   14680 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:23.124179   14680 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:16:23.127534   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:16:23.127546   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:16:23.194603   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:16:23.194639   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:16:25.725656   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:16:25.735521   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:16:25.735580   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:16:25.760626   54807 cri.go:89] found id: ""
	I1202 19:16:25.760640   54807 logs.go:282] 0 containers: []
	W1202 19:16:25.760647   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:16:25.760652   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:16:25.760711   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:16:25.786443   54807 cri.go:89] found id: ""
	I1202 19:16:25.786457   54807 logs.go:282] 0 containers: []
	W1202 19:16:25.786464   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:16:25.786470   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:16:25.786529   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:16:25.813975   54807 cri.go:89] found id: ""
	I1202 19:16:25.813989   54807 logs.go:282] 0 containers: []
	W1202 19:16:25.813996   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:16:25.814001   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:16:25.814059   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:16:25.839899   54807 cri.go:89] found id: ""
	I1202 19:16:25.839912   54807 logs.go:282] 0 containers: []
	W1202 19:16:25.839920   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:16:25.839925   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:16:25.839983   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:16:25.869299   54807 cri.go:89] found id: ""
	I1202 19:16:25.869312   54807 logs.go:282] 0 containers: []
	W1202 19:16:25.869319   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:16:25.869325   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:16:25.869384   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:16:25.894364   54807 cri.go:89] found id: ""
	I1202 19:16:25.894379   54807 logs.go:282] 0 containers: []
	W1202 19:16:25.894385   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:16:25.894391   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:16:25.894448   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:16:25.919717   54807 cri.go:89] found id: ""
	I1202 19:16:25.919733   54807 logs.go:282] 0 containers: []
	W1202 19:16:25.919741   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:16:25.919748   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:16:25.919759   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:16:25.988177   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:16:25.979290   14781 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:25.979818   14781 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:25.981491   14781 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:25.982030   14781 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:25.983829   14781 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:16:25.979290   14781 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:25.979818   14781 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:25.981491   14781 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:25.982030   14781 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:25.983829   14781 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:16:25.988188   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:16:25.988198   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:16:26.052787   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:16:26.052806   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:16:26.081027   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:16:26.081042   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:16:26.138061   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:16:26.138079   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:16:28.650000   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:16:28.660481   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:16:28.660541   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:16:28.685594   54807 cri.go:89] found id: ""
	I1202 19:16:28.685608   54807 logs.go:282] 0 containers: []
	W1202 19:16:28.685616   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:16:28.685621   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:16:28.685679   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:16:28.710399   54807 cri.go:89] found id: ""
	I1202 19:16:28.710412   54807 logs.go:282] 0 containers: []
	W1202 19:16:28.710419   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:16:28.710425   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:16:28.710481   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:16:28.735520   54807 cri.go:89] found id: ""
	I1202 19:16:28.735533   54807 logs.go:282] 0 containers: []
	W1202 19:16:28.735546   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:16:28.735551   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:16:28.735607   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:16:28.762423   54807 cri.go:89] found id: ""
	I1202 19:16:28.762436   54807 logs.go:282] 0 containers: []
	W1202 19:16:28.762443   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:16:28.762449   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:16:28.762515   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:16:28.791746   54807 cri.go:89] found id: ""
	I1202 19:16:28.791760   54807 logs.go:282] 0 containers: []
	W1202 19:16:28.791767   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:16:28.791772   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:16:28.791831   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:16:28.818359   54807 cri.go:89] found id: ""
	I1202 19:16:28.818372   54807 logs.go:282] 0 containers: []
	W1202 19:16:28.818379   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:16:28.818386   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:16:28.818443   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:16:28.846465   54807 cri.go:89] found id: ""
	I1202 19:16:28.846479   54807 logs.go:282] 0 containers: []
	W1202 19:16:28.846486   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:16:28.846494   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:16:28.846503   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:16:28.903412   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:16:28.903430   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:16:28.914210   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:16:28.914267   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:16:28.978428   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:16:28.970543   14894 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:28.971044   14894 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:28.972803   14894 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:28.973286   14894 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:28.974839   14894 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:16:28.970543   14894 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:28.971044   14894 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:28.972803   14894 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:28.973286   14894 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:28.974839   14894 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:16:28.978439   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:16:28.978450   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:16:29.041343   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:16:29.041363   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:16:31.570595   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:16:31.583500   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:16:31.583573   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:16:31.611783   54807 cri.go:89] found id: ""
	I1202 19:16:31.611796   54807 logs.go:282] 0 containers: []
	W1202 19:16:31.611805   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:16:31.611811   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:16:31.611868   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:16:31.639061   54807 cri.go:89] found id: ""
	I1202 19:16:31.639074   54807 logs.go:282] 0 containers: []
	W1202 19:16:31.639081   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:16:31.639086   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:16:31.639152   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:16:31.664706   54807 cri.go:89] found id: ""
	I1202 19:16:31.664719   54807 logs.go:282] 0 containers: []
	W1202 19:16:31.664726   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:16:31.664732   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:16:31.664789   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:16:31.688725   54807 cri.go:89] found id: ""
	I1202 19:16:31.688739   54807 logs.go:282] 0 containers: []
	W1202 19:16:31.688746   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:16:31.688751   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:16:31.688807   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:16:31.713308   54807 cri.go:89] found id: ""
	I1202 19:16:31.713321   54807 logs.go:282] 0 containers: []
	W1202 19:16:31.713328   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:16:31.713333   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:16:31.713391   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:16:31.737960   54807 cri.go:89] found id: ""
	I1202 19:16:31.737973   54807 logs.go:282] 0 containers: []
	W1202 19:16:31.737980   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:16:31.737985   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:16:31.738041   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:16:31.766035   54807 cri.go:89] found id: ""
	I1202 19:16:31.766048   54807 logs.go:282] 0 containers: []
	W1202 19:16:31.766055   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:16:31.766063   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:16:31.766078   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:16:31.821307   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:16:31.821327   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:16:31.832103   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:16:31.832118   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:16:31.894804   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:16:31.886409   14997 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:31.887232   14997 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:31.888852   14997 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:31.889449   14997 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:31.891143   14997 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:16:31.886409   14997 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:31.887232   14997 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:31.888852   14997 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:31.889449   14997 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:31.891143   14997 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:16:31.894814   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:16:31.894824   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:16:31.958623   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:16:31.958641   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:16:34.494532   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:16:34.504804   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:16:34.504861   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:16:34.534339   54807 cri.go:89] found id: ""
	I1202 19:16:34.534359   54807 logs.go:282] 0 containers: []
	W1202 19:16:34.534366   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:16:34.534372   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:16:34.534430   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:16:34.559181   54807 cri.go:89] found id: ""
	I1202 19:16:34.559194   54807 logs.go:282] 0 containers: []
	W1202 19:16:34.559203   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:16:34.559208   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:16:34.559266   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:16:34.583120   54807 cri.go:89] found id: ""
	I1202 19:16:34.583133   54807 logs.go:282] 0 containers: []
	W1202 19:16:34.583139   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:16:34.583145   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:16:34.583245   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:16:34.608256   54807 cri.go:89] found id: ""
	I1202 19:16:34.608269   54807 logs.go:282] 0 containers: []
	W1202 19:16:34.608276   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:16:34.608282   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:16:34.608365   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:16:34.632733   54807 cri.go:89] found id: ""
	I1202 19:16:34.632747   54807 logs.go:282] 0 containers: []
	W1202 19:16:34.632754   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:16:34.632759   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:16:34.632821   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:16:34.663293   54807 cri.go:89] found id: ""
	I1202 19:16:34.663307   54807 logs.go:282] 0 containers: []
	W1202 19:16:34.663314   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:16:34.663320   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:16:34.663376   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:16:34.686842   54807 cri.go:89] found id: ""
	I1202 19:16:34.686856   54807 logs.go:282] 0 containers: []
	W1202 19:16:34.686863   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:16:34.686871   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:16:34.686881   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:16:34.697549   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:16:34.697564   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:16:34.764406   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:16:34.756417   15099 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:34.757141   15099 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:34.758783   15099 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:34.759285   15099 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:34.760837   15099 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:16:34.756417   15099 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:34.757141   15099 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:34.758783   15099 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:34.759285   15099 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:34.760837   15099 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:16:34.764416   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:16:34.764427   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:16:34.827201   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:16:34.827223   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:16:34.854552   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:16:34.854570   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:16:37.413003   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:16:37.423382   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:16:37.423441   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:16:37.459973   54807 cri.go:89] found id: ""
	I1202 19:16:37.459987   54807 logs.go:282] 0 containers: []
	W1202 19:16:37.459994   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:16:37.460000   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:16:37.460062   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:16:37.494488   54807 cri.go:89] found id: ""
	I1202 19:16:37.494503   54807 logs.go:282] 0 containers: []
	W1202 19:16:37.494510   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:16:37.494515   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:16:37.494584   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:16:37.519270   54807 cri.go:89] found id: ""
	I1202 19:16:37.519283   54807 logs.go:282] 0 containers: []
	W1202 19:16:37.519290   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:16:37.519295   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:16:37.519351   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:16:37.545987   54807 cri.go:89] found id: ""
	I1202 19:16:37.546001   54807 logs.go:282] 0 containers: []
	W1202 19:16:37.546008   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:16:37.546013   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:16:37.546069   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:16:37.574348   54807 cri.go:89] found id: ""
	I1202 19:16:37.574362   54807 logs.go:282] 0 containers: []
	W1202 19:16:37.574369   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:16:37.574375   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:16:37.574437   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:16:37.600075   54807 cri.go:89] found id: ""
	I1202 19:16:37.600089   54807 logs.go:282] 0 containers: []
	W1202 19:16:37.600096   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:16:37.600102   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:16:37.600167   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:16:37.625421   54807 cri.go:89] found id: ""
	I1202 19:16:37.625434   54807 logs.go:282] 0 containers: []
	W1202 19:16:37.625443   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:16:37.625450   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:16:37.625460   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:16:37.688980   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:16:37.689000   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:16:37.719329   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:16:37.719344   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:16:37.778206   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:16:37.778225   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:16:37.789133   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:16:37.789148   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:16:37.856498   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:16:37.848835   15218 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:37.849672   15218 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:37.851302   15218 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:37.851842   15218 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:37.853113   15218 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:16:37.848835   15218 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:37.849672   15218 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:37.851302   15218 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:37.851842   15218 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:37.853113   15218 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:16:40.358183   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:16:40.368449   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:16:40.368509   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:16:40.392705   54807 cri.go:89] found id: ""
	I1202 19:16:40.392721   54807 logs.go:282] 0 containers: []
	W1202 19:16:40.392728   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:16:40.392734   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:16:40.392796   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:16:40.417408   54807 cri.go:89] found id: ""
	I1202 19:16:40.417422   54807 logs.go:282] 0 containers: []
	W1202 19:16:40.417429   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:16:40.417435   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:16:40.417493   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:16:40.458012   54807 cri.go:89] found id: ""
	I1202 19:16:40.458026   54807 logs.go:282] 0 containers: []
	W1202 19:16:40.458033   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:16:40.458039   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:16:40.458094   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:16:40.498315   54807 cri.go:89] found id: ""
	I1202 19:16:40.498328   54807 logs.go:282] 0 containers: []
	W1202 19:16:40.498335   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:16:40.498341   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:16:40.498402   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:16:40.523770   54807 cri.go:89] found id: ""
	I1202 19:16:40.523784   54807 logs.go:282] 0 containers: []
	W1202 19:16:40.523792   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:16:40.523797   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:16:40.523865   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:16:40.549124   54807 cri.go:89] found id: ""
	I1202 19:16:40.549137   54807 logs.go:282] 0 containers: []
	W1202 19:16:40.549144   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:16:40.549149   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:16:40.549207   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:16:40.573667   54807 cri.go:89] found id: ""
	I1202 19:16:40.573680   54807 logs.go:282] 0 containers: []
	W1202 19:16:40.573688   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:16:40.573696   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:16:40.573708   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:16:40.629671   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:16:40.629688   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:16:40.640745   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:16:40.640760   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:16:40.706165   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:16:40.697573   15312 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:40.698405   15312 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:40.700020   15312 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:40.700368   15312 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:40.702012   15312 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:16:40.697573   15312 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:40.698405   15312 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:40.700020   15312 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:40.700368   15312 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:40.702012   15312 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:16:40.706175   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:16:40.706186   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:16:40.775737   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:16:40.775755   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:16:43.307135   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:16:43.317487   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:16:43.317553   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:16:43.342709   54807 cri.go:89] found id: ""
	I1202 19:16:43.342722   54807 logs.go:282] 0 containers: []
	W1202 19:16:43.342730   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:16:43.342735   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:16:43.342793   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:16:43.367380   54807 cri.go:89] found id: ""
	I1202 19:16:43.367393   54807 logs.go:282] 0 containers: []
	W1202 19:16:43.367400   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:16:43.367406   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:16:43.367462   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:16:43.394678   54807 cri.go:89] found id: ""
	I1202 19:16:43.394691   54807 logs.go:282] 0 containers: []
	W1202 19:16:43.394699   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:16:43.394704   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:16:43.394761   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:16:43.421130   54807 cri.go:89] found id: ""
	I1202 19:16:43.421144   54807 logs.go:282] 0 containers: []
	W1202 19:16:43.421151   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:16:43.421156   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:16:43.421212   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:16:43.454728   54807 cri.go:89] found id: ""
	I1202 19:16:43.454741   54807 logs.go:282] 0 containers: []
	W1202 19:16:43.454749   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:16:43.454754   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:16:43.454810   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:16:43.491457   54807 cri.go:89] found id: ""
	I1202 19:16:43.491470   54807 logs.go:282] 0 containers: []
	W1202 19:16:43.491477   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:16:43.491482   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:16:43.491537   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:16:43.515943   54807 cri.go:89] found id: ""
	I1202 19:16:43.515957   54807 logs.go:282] 0 containers: []
	W1202 19:16:43.515964   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:16:43.515972   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:16:43.515982   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:16:43.579953   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:16:43.579972   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:16:43.608617   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:16:43.608632   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:16:43.666586   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:16:43.666604   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:16:43.677358   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:16:43.677374   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:16:43.741646   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:16:43.733470   15433 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:43.734235   15433 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:43.735923   15433 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:43.736656   15433 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:43.738348   15433 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:16:43.733470   15433 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:43.734235   15433 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:43.735923   15433 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:43.736656   15433 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:43.738348   15433 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:16:46.243365   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:16:46.255599   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:16:46.255658   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:16:46.280357   54807 cri.go:89] found id: ""
	I1202 19:16:46.280369   54807 logs.go:282] 0 containers: []
	W1202 19:16:46.280376   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:16:46.280382   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:16:46.280444   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:16:46.304610   54807 cri.go:89] found id: ""
	I1202 19:16:46.304623   54807 logs.go:282] 0 containers: []
	W1202 19:16:46.304630   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:16:46.304635   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:16:46.304692   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:16:46.328944   54807 cri.go:89] found id: ""
	I1202 19:16:46.328957   54807 logs.go:282] 0 containers: []
	W1202 19:16:46.328963   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:16:46.328968   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:16:46.329027   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:16:46.357896   54807 cri.go:89] found id: ""
	I1202 19:16:46.357909   54807 logs.go:282] 0 containers: []
	W1202 19:16:46.357916   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:16:46.357923   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:16:46.357981   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:16:46.381601   54807 cri.go:89] found id: ""
	I1202 19:16:46.381613   54807 logs.go:282] 0 containers: []
	W1202 19:16:46.381620   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:16:46.381626   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:16:46.381687   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:16:46.406928   54807 cri.go:89] found id: ""
	I1202 19:16:46.406942   54807 logs.go:282] 0 containers: []
	W1202 19:16:46.406949   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:16:46.406954   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:16:46.407009   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:16:46.449373   54807 cri.go:89] found id: ""
	I1202 19:16:46.449386   54807 logs.go:282] 0 containers: []
	W1202 19:16:46.449393   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:16:46.449401   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:16:46.449411   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:16:46.516162   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:16:46.516180   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:16:46.527166   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:16:46.527183   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:16:46.590201   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:16:46.582136   15525 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:46.583309   15525 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:46.584090   15525 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:46.585115   15525 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:46.585711   15525 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:16:46.582136   15525 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:46.583309   15525 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:46.584090   15525 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:46.585115   15525 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:46.585711   15525 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:16:46.590211   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:16:46.590221   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:16:46.652574   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:16:46.652593   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:16:49.180131   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:16:49.190665   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:16:49.190729   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:16:49.215295   54807 cri.go:89] found id: ""
	I1202 19:16:49.215308   54807 logs.go:282] 0 containers: []
	W1202 19:16:49.215315   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:16:49.215321   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:16:49.215382   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:16:49.241898   54807 cri.go:89] found id: ""
	I1202 19:16:49.241912   54807 logs.go:282] 0 containers: []
	W1202 19:16:49.241919   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:16:49.241925   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:16:49.241986   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:16:49.266638   54807 cri.go:89] found id: ""
	I1202 19:16:49.266651   54807 logs.go:282] 0 containers: []
	W1202 19:16:49.266658   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:16:49.266664   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:16:49.266719   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:16:49.292478   54807 cri.go:89] found id: ""
	I1202 19:16:49.292496   54807 logs.go:282] 0 containers: []
	W1202 19:16:49.292506   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:16:49.292512   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:16:49.292589   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:16:49.318280   54807 cri.go:89] found id: ""
	I1202 19:16:49.318293   54807 logs.go:282] 0 containers: []
	W1202 19:16:49.318300   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:16:49.318306   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:16:49.318373   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:16:49.351760   54807 cri.go:89] found id: ""
	I1202 19:16:49.351774   54807 logs.go:282] 0 containers: []
	W1202 19:16:49.351787   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:16:49.351793   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:16:49.351854   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:16:49.376513   54807 cri.go:89] found id: ""
	I1202 19:16:49.376536   54807 logs.go:282] 0 containers: []
	W1202 19:16:49.376543   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:16:49.376551   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:16:49.376563   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:16:49.448960   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:16:49.448987   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:16:49.482655   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:16:49.482673   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:16:49.541305   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:16:49.541322   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:16:49.552971   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:16:49.552988   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:16:49.618105   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:16:49.609636   15641 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:49.610374   15641 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:49.612148   15641 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:49.612903   15641 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:49.614517   15641 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:16:49.609636   15641 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:49.610374   15641 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:49.612148   15641 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:49.612903   15641 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:49.614517   15641 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:16:52.119791   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:16:52.130607   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:16:52.130669   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:16:52.155643   54807 cri.go:89] found id: ""
	I1202 19:16:52.155656   54807 logs.go:282] 0 containers: []
	W1202 19:16:52.155663   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:16:52.155669   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:16:52.155729   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:16:52.179230   54807 cri.go:89] found id: ""
	I1202 19:16:52.179244   54807 logs.go:282] 0 containers: []
	W1202 19:16:52.179253   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:16:52.179259   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:16:52.179316   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:16:52.203772   54807 cri.go:89] found id: ""
	I1202 19:16:52.203785   54807 logs.go:282] 0 containers: []
	W1202 19:16:52.203792   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:16:52.203798   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:16:52.203852   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:16:52.236168   54807 cri.go:89] found id: ""
	I1202 19:16:52.236183   54807 logs.go:282] 0 containers: []
	W1202 19:16:52.236190   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:16:52.236196   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:16:52.236257   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:16:52.260979   54807 cri.go:89] found id: ""
	I1202 19:16:52.260995   54807 logs.go:282] 0 containers: []
	W1202 19:16:52.261003   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:16:52.261008   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:16:52.261063   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:16:52.284287   54807 cri.go:89] found id: ""
	I1202 19:16:52.284299   54807 logs.go:282] 0 containers: []
	W1202 19:16:52.284306   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:16:52.284312   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:16:52.284385   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:16:52.310376   54807 cri.go:89] found id: ""
	I1202 19:16:52.310390   54807 logs.go:282] 0 containers: []
	W1202 19:16:52.310397   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:16:52.310405   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:16:52.310415   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:16:52.366619   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:16:52.366636   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:16:52.377556   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:16:52.377572   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:16:52.453208   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:16:52.444029   15727 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:52.444937   15727 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:52.446866   15727 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:52.447608   15727 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:52.449367   15727 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:16:52.444029   15727 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:52.444937   15727 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:52.446866   15727 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:52.447608   15727 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:52.449367   15727 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:16:52.453218   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:16:52.453229   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:16:52.524196   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:16:52.524214   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:16:55.052717   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:16:55.063878   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:16:55.063943   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:16:55.089568   54807 cri.go:89] found id: ""
	I1202 19:16:55.089582   54807 logs.go:282] 0 containers: []
	W1202 19:16:55.089588   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:16:55.089594   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:16:55.089658   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:16:55.116741   54807 cri.go:89] found id: ""
	I1202 19:16:55.116755   54807 logs.go:282] 0 containers: []
	W1202 19:16:55.116762   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:16:55.116768   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:16:55.116825   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:16:55.142748   54807 cri.go:89] found id: ""
	I1202 19:16:55.142761   54807 logs.go:282] 0 containers: []
	W1202 19:16:55.142768   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:16:55.142774   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:16:55.142836   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:16:55.167341   54807 cri.go:89] found id: ""
	I1202 19:16:55.167354   54807 logs.go:282] 0 containers: []
	W1202 19:16:55.167361   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:16:55.167367   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:16:55.167424   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:16:55.194118   54807 cri.go:89] found id: ""
	I1202 19:16:55.194132   54807 logs.go:282] 0 containers: []
	W1202 19:16:55.194139   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:16:55.194144   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:16:55.194201   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:16:55.218379   54807 cri.go:89] found id: ""
	I1202 19:16:55.218393   54807 logs.go:282] 0 containers: []
	W1202 19:16:55.218400   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:16:55.218406   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:16:55.218465   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:16:55.243035   54807 cri.go:89] found id: ""
	I1202 19:16:55.243048   54807 logs.go:282] 0 containers: []
	W1202 19:16:55.243055   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:16:55.243063   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:16:55.243073   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:16:55.310493   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:16:55.302627   15827 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:55.303246   15827 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:55.304790   15827 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:55.305303   15827 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:55.306777   15827 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:16:55.302627   15827 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:55.303246   15827 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:55.304790   15827 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:55.305303   15827 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:55.306777   15827 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:16:55.310504   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:16:55.310517   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:16:55.373914   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:16:55.373933   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:16:55.405157   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:16:55.405172   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:16:55.473565   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:16:55.473583   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:16:57.986363   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:16:57.996902   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:16:57.996969   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:16:58.023028   54807 cri.go:89] found id: ""
	I1202 19:16:58.023042   54807 logs.go:282] 0 containers: []
	W1202 19:16:58.023049   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:16:58.023055   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:16:58.023113   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:16:58.049927   54807 cri.go:89] found id: ""
	I1202 19:16:58.049941   54807 logs.go:282] 0 containers: []
	W1202 19:16:58.049947   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:16:58.049953   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:16:58.050013   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:16:58.078428   54807 cri.go:89] found id: ""
	I1202 19:16:58.078448   54807 logs.go:282] 0 containers: []
	W1202 19:16:58.078456   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:16:58.078461   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:16:58.078516   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:16:58.105365   54807 cri.go:89] found id: ""
	I1202 19:16:58.105377   54807 logs.go:282] 0 containers: []
	W1202 19:16:58.105385   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:16:58.105390   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:16:58.105448   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:16:58.129444   54807 cri.go:89] found id: ""
	I1202 19:16:58.129458   54807 logs.go:282] 0 containers: []
	W1202 19:16:58.129465   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:16:58.129470   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:16:58.129531   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:16:58.157574   54807 cri.go:89] found id: ""
	I1202 19:16:58.157588   54807 logs.go:282] 0 containers: []
	W1202 19:16:58.157594   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:16:58.157607   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:16:58.157670   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:16:58.182028   54807 cri.go:89] found id: ""
	I1202 19:16:58.182042   54807 logs.go:282] 0 containers: []
	W1202 19:16:58.182049   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:16:58.182057   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:16:58.182067   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:16:58.241166   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:16:58.241184   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:16:58.252367   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:16:58.252383   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:16:58.319914   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:16:58.311929   15940 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:58.312824   15940 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:58.314498   15940 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:58.315073   15940 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:58.316444   15940 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:16:58.311929   15940 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:58.312824   15940 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:58.314498   15940 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:58.315073   15940 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:58.316444   15940 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:16:58.319925   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:16:58.319937   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:16:58.381228   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:16:58.381246   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:17:00.909644   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:17:00.920924   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:17:00.921037   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:17:00.947793   54807 cri.go:89] found id: ""
	I1202 19:17:00.947812   54807 logs.go:282] 0 containers: []
	W1202 19:17:00.947820   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:17:00.947828   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:17:00.947900   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:17:00.975539   54807 cri.go:89] found id: ""
	I1202 19:17:00.975553   54807 logs.go:282] 0 containers: []
	W1202 19:17:00.975561   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:17:00.975566   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:17:00.975629   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:17:01.002532   54807 cri.go:89] found id: ""
	I1202 19:17:01.002549   54807 logs.go:282] 0 containers: []
	W1202 19:17:01.002560   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:17:01.002566   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:17:01.002636   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:17:01.032211   54807 cri.go:89] found id: ""
	I1202 19:17:01.032226   54807 logs.go:282] 0 containers: []
	W1202 19:17:01.032233   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:17:01.032239   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:17:01.032302   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:17:01.059398   54807 cri.go:89] found id: ""
	I1202 19:17:01.059413   54807 logs.go:282] 0 containers: []
	W1202 19:17:01.059420   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:17:01.059426   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:17:01.059486   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:17:01.091722   54807 cri.go:89] found id: ""
	I1202 19:17:01.091740   54807 logs.go:282] 0 containers: []
	W1202 19:17:01.091746   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:17:01.091752   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:17:01.091816   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:17:01.117849   54807 cri.go:89] found id: ""
	I1202 19:17:01.117864   54807 logs.go:282] 0 containers: []
	W1202 19:17:01.117871   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:17:01.117879   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:17:01.117893   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:17:01.191972   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:17:01.182202   16040 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:01.183119   16040 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:01.185191   16040 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:01.186030   16040 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:01.187874   16040 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:17:01.182202   16040 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:01.183119   16040 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:01.185191   16040 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:01.186030   16040 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:01.187874   16040 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:17:01.191984   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:17:01.191997   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:17:01.260783   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:17:01.260806   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:17:01.290665   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:17:01.290683   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:17:01.348633   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:17:01.348653   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:17:03.860845   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:17:03.871899   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:17:03.871966   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:17:03.899158   54807 cri.go:89] found id: ""
	I1202 19:17:03.899172   54807 logs.go:282] 0 containers: []
	W1202 19:17:03.899179   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:17:03.899185   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:17:03.899244   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:17:03.925147   54807 cri.go:89] found id: ""
	I1202 19:17:03.925161   54807 logs.go:282] 0 containers: []
	W1202 19:17:03.925168   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:17:03.925174   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:17:03.925235   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:17:03.955130   54807 cri.go:89] found id: ""
	I1202 19:17:03.955143   54807 logs.go:282] 0 containers: []
	W1202 19:17:03.955150   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:17:03.955156   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:17:03.955215   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:17:03.983272   54807 cri.go:89] found id: ""
	I1202 19:17:03.983286   54807 logs.go:282] 0 containers: []
	W1202 19:17:03.983294   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:17:03.983300   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:17:03.983371   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:17:04.009435   54807 cri.go:89] found id: ""
	I1202 19:17:04.009449   54807 logs.go:282] 0 containers: []
	W1202 19:17:04.009456   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:17:04.009463   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:17:04.009523   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:17:04.037346   54807 cri.go:89] found id: ""
	I1202 19:17:04.037360   54807 logs.go:282] 0 containers: []
	W1202 19:17:04.037368   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:17:04.037374   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:17:04.037433   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:17:04.066662   54807 cri.go:89] found id: ""
	I1202 19:17:04.066675   54807 logs.go:282] 0 containers: []
	W1202 19:17:04.066682   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:17:04.066690   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:17:04.066701   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:17:04.125350   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:17:04.125369   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:17:04.136698   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:17:04.136716   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:17:04.206327   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:17:04.198119   16153 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:04.198761   16153 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:04.200411   16153 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:04.201024   16153 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:04.202642   16153 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:17:04.198119   16153 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:04.198761   16153 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:04.200411   16153 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:04.201024   16153 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:04.202642   16153 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:17:04.206338   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:17:04.206353   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:17:04.274588   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:17:04.274608   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:17:06.806010   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:17:06.817189   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:17:06.817256   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:17:06.843114   54807 cri.go:89] found id: ""
	I1202 19:17:06.843129   54807 logs.go:282] 0 containers: []
	W1202 19:17:06.843136   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:17:06.843142   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:17:06.843218   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:17:06.873921   54807 cri.go:89] found id: ""
	I1202 19:17:06.873947   54807 logs.go:282] 0 containers: []
	W1202 19:17:06.873955   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:17:06.873961   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:17:06.874045   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:17:06.900636   54807 cri.go:89] found id: ""
	I1202 19:17:06.900651   54807 logs.go:282] 0 containers: []
	W1202 19:17:06.900658   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:17:06.900664   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:17:06.900724   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:17:06.928484   54807 cri.go:89] found id: ""
	I1202 19:17:06.928504   54807 logs.go:282] 0 containers: []
	W1202 19:17:06.928512   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:17:06.928518   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:17:06.928583   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:17:06.956137   54807 cri.go:89] found id: ""
	I1202 19:17:06.956170   54807 logs.go:282] 0 containers: []
	W1202 19:17:06.956179   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:17:06.956184   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:17:06.956258   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:17:06.987383   54807 cri.go:89] found id: ""
	I1202 19:17:06.987408   54807 logs.go:282] 0 containers: []
	W1202 19:17:06.987416   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:17:06.987422   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:17:06.987495   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:17:07.013712   54807 cri.go:89] found id: ""
	I1202 19:17:07.013726   54807 logs.go:282] 0 containers: []
	W1202 19:17:07.013733   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:17:07.013741   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:17:07.013756   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:17:07.076937   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:17:07.076955   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:17:07.106847   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:17:07.106863   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:17:07.164565   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:17:07.164584   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:17:07.177132   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:17:07.177154   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:17:07.245572   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:17:07.237375   16272 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:07.238081   16272 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:07.239663   16272 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:07.240205   16272 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:07.241966   16272 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:17:07.237375   16272 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:07.238081   16272 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:07.239663   16272 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:07.240205   16272 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:07.241966   16272 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:17:09.745822   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:17:09.756122   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:17:09.756180   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:17:09.784649   54807 cri.go:89] found id: ""
	I1202 19:17:09.784663   54807 logs.go:282] 0 containers: []
	W1202 19:17:09.784670   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:17:09.784675   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:17:09.784732   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:17:09.809632   54807 cri.go:89] found id: ""
	I1202 19:17:09.809655   54807 logs.go:282] 0 containers: []
	W1202 19:17:09.809662   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:17:09.809668   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:17:09.809733   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:17:09.839403   54807 cri.go:89] found id: ""
	I1202 19:17:09.839425   54807 logs.go:282] 0 containers: []
	W1202 19:17:09.839433   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:17:09.839439   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:17:09.839504   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:17:09.868977   54807 cri.go:89] found id: ""
	I1202 19:17:09.868991   54807 logs.go:282] 0 containers: []
	W1202 19:17:09.868999   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:17:09.869004   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:17:09.869064   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:17:09.894156   54807 cri.go:89] found id: ""
	I1202 19:17:09.894170   54807 logs.go:282] 0 containers: []
	W1202 19:17:09.894176   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:17:09.894182   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:17:09.894237   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:17:09.919174   54807 cri.go:89] found id: ""
	I1202 19:17:09.919188   54807 logs.go:282] 0 containers: []
	W1202 19:17:09.919195   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:17:09.919200   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:17:09.919261   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:17:09.944620   54807 cri.go:89] found id: ""
	I1202 19:17:09.944632   54807 logs.go:282] 0 containers: []
	W1202 19:17:09.944639   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:17:09.944647   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:17:09.944657   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:17:10.004028   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:17:10.004049   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:17:10.015962   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:17:10.015979   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:17:10.086133   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:17:10.078544   16365 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:10.079196   16365 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:10.080924   16365 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:10.081452   16365 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:10.082613   16365 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:17:10.078544   16365 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:10.079196   16365 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:10.080924   16365 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:10.081452   16365 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:10.082613   16365 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:17:10.086143   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:17:10.086153   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:17:10.148419   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:17:10.148437   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:17:12.676458   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:17:12.687083   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:17:12.687155   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:17:12.712590   54807 cri.go:89] found id: ""
	I1202 19:17:12.712604   54807 logs.go:282] 0 containers: []
	W1202 19:17:12.712611   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:17:12.712616   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:17:12.712674   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:17:12.737565   54807 cri.go:89] found id: ""
	I1202 19:17:12.737578   54807 logs.go:282] 0 containers: []
	W1202 19:17:12.737585   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:17:12.737591   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:17:12.737648   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:17:12.762201   54807 cri.go:89] found id: ""
	I1202 19:17:12.762216   54807 logs.go:282] 0 containers: []
	W1202 19:17:12.762223   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:17:12.762228   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:17:12.762288   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:17:12.786736   54807 cri.go:89] found id: ""
	I1202 19:17:12.786750   54807 logs.go:282] 0 containers: []
	W1202 19:17:12.786758   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:17:12.786763   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:17:12.786825   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:17:12.811994   54807 cri.go:89] found id: ""
	I1202 19:17:12.812008   54807 logs.go:282] 0 containers: []
	W1202 19:17:12.812015   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:17:12.812020   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:17:12.812078   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:17:12.838580   54807 cri.go:89] found id: ""
	I1202 19:17:12.838593   54807 logs.go:282] 0 containers: []
	W1202 19:17:12.838600   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:17:12.838605   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:17:12.838659   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:17:12.863652   54807 cri.go:89] found id: ""
	I1202 19:17:12.863665   54807 logs.go:282] 0 containers: []
	W1202 19:17:12.863672   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:17:12.863679   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:17:12.863689   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:17:12.918766   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:17:12.918784   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:17:12.930406   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:17:12.930428   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:17:13.000633   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:17:12.992135   16467 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:12.992970   16467 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:12.994592   16467 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:12.994977   16467 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:12.996559   16467 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:17:12.992135   16467 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:12.992970   16467 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:12.994592   16467 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:12.994977   16467 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:12.996559   16467 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:17:13.000643   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:17:13.000655   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:17:13.065384   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:17:13.065403   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:17:15.594382   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:17:15.604731   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:17:15.604795   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:17:15.634332   54807 cri.go:89] found id: ""
	I1202 19:17:15.634345   54807 logs.go:282] 0 containers: []
	W1202 19:17:15.634353   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:17:15.634358   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:17:15.634434   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:17:15.663126   54807 cri.go:89] found id: ""
	I1202 19:17:15.663141   54807 logs.go:282] 0 containers: []
	W1202 19:17:15.663148   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:17:15.663153   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:17:15.663217   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:17:15.699033   54807 cri.go:89] found id: ""
	I1202 19:17:15.699051   54807 logs.go:282] 0 containers: []
	W1202 19:17:15.699059   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:17:15.699065   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:17:15.699121   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:17:15.727044   54807 cri.go:89] found id: ""
	I1202 19:17:15.727057   54807 logs.go:282] 0 containers: []
	W1202 19:17:15.727065   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:17:15.727071   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:17:15.727129   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:17:15.754131   54807 cri.go:89] found id: ""
	I1202 19:17:15.754152   54807 logs.go:282] 0 containers: []
	W1202 19:17:15.754159   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:17:15.754165   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:17:15.754224   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:17:15.778325   54807 cri.go:89] found id: ""
	I1202 19:17:15.778338   54807 logs.go:282] 0 containers: []
	W1202 19:17:15.778345   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:17:15.778350   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:17:15.778407   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:17:15.803363   54807 cri.go:89] found id: ""
	I1202 19:17:15.803376   54807 logs.go:282] 0 containers: []
	W1202 19:17:15.803383   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:17:15.803391   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:17:15.803403   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:17:15.814039   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:17:15.814055   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:17:15.885494   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:17:15.877151   16570 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:15.877774   16570 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:15.878766   16570 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:15.880347   16570 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:15.880955   16570 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:17:15.877151   16570 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:15.877774   16570 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:15.878766   16570 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:15.880347   16570 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:15.880955   16570 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:17:15.885505   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:17:15.885516   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:17:15.947276   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:17:15.947295   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:17:15.979963   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:17:15.979981   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:17:18.538313   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:17:18.548423   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:17:18.548490   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:17:18.571700   54807 cri.go:89] found id: ""
	I1202 19:17:18.571714   54807 logs.go:282] 0 containers: []
	W1202 19:17:18.571721   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:17:18.571726   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:17:18.571784   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:17:18.600197   54807 cri.go:89] found id: ""
	I1202 19:17:18.600211   54807 logs.go:282] 0 containers: []
	W1202 19:17:18.600219   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:17:18.600224   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:17:18.600279   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:17:18.628309   54807 cri.go:89] found id: ""
	I1202 19:17:18.628341   54807 logs.go:282] 0 containers: []
	W1202 19:17:18.628348   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:17:18.628353   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:17:18.628440   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:17:18.654241   54807 cri.go:89] found id: ""
	I1202 19:17:18.654255   54807 logs.go:282] 0 containers: []
	W1202 19:17:18.654263   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:17:18.654268   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:17:18.654325   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:17:18.690109   54807 cri.go:89] found id: ""
	I1202 19:17:18.690123   54807 logs.go:282] 0 containers: []
	W1202 19:17:18.690130   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:17:18.690135   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:17:18.690194   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:17:18.719625   54807 cri.go:89] found id: ""
	I1202 19:17:18.719638   54807 logs.go:282] 0 containers: []
	W1202 19:17:18.719646   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:17:18.719651   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:17:18.719713   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:17:18.753094   54807 cri.go:89] found id: ""
	I1202 19:17:18.753108   54807 logs.go:282] 0 containers: []
	W1202 19:17:18.753116   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:17:18.753124   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:17:18.753135   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:17:18.782592   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:17:18.782608   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:17:18.837738   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:17:18.837757   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:17:18.848921   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:17:18.848937   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:17:18.918012   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:17:18.908756   16691 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:18.909735   16691 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:18.911590   16691 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:18.912295   16691 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:18.913925   16691 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:17:18.908756   16691 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:18.909735   16691 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:18.911590   16691 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:18.912295   16691 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:18.913925   16691 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:17:18.918023   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:17:18.918034   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:17:21.481252   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:17:21.491493   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:17:21.491550   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:17:21.515967   54807 cri.go:89] found id: ""
	I1202 19:17:21.515980   54807 logs.go:282] 0 containers: []
	W1202 19:17:21.515987   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:17:21.515993   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:17:21.516049   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:17:21.545239   54807 cri.go:89] found id: ""
	I1202 19:17:21.545256   54807 logs.go:282] 0 containers: []
	W1202 19:17:21.545263   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:17:21.545268   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:17:21.545349   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:17:21.574561   54807 cri.go:89] found id: ""
	I1202 19:17:21.574575   54807 logs.go:282] 0 containers: []
	W1202 19:17:21.574582   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:17:21.574588   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:17:21.574643   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:17:21.600546   54807 cri.go:89] found id: ""
	I1202 19:17:21.600567   54807 logs.go:282] 0 containers: []
	W1202 19:17:21.600575   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:17:21.600581   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:17:21.600647   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:17:21.625602   54807 cri.go:89] found id: ""
	I1202 19:17:21.625616   54807 logs.go:282] 0 containers: []
	W1202 19:17:21.625623   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:17:21.625629   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:17:21.625691   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:17:21.650573   54807 cri.go:89] found id: ""
	I1202 19:17:21.650586   54807 logs.go:282] 0 containers: []
	W1202 19:17:21.650593   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:17:21.650599   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:17:21.650655   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:17:21.680099   54807 cri.go:89] found id: ""
	I1202 19:17:21.680113   54807 logs.go:282] 0 containers: []
	W1202 19:17:21.680120   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:17:21.680128   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:17:21.680155   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:17:21.750582   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:17:21.750601   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:17:21.762564   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:17:21.762580   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:17:21.827497   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:17:21.819360   16787 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:21.820080   16787 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:21.821817   16787 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:21.822472   16787 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:21.824049   16787 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:17:21.819360   16787 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:21.820080   16787 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:21.821817   16787 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:21.822472   16787 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:21.824049   16787 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:17:21.827507   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:17:21.827518   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:17:21.889794   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:17:21.889812   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:17:24.421754   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:17:24.432162   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:17:24.432233   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:17:24.456800   54807 cri.go:89] found id: ""
	I1202 19:17:24.456814   54807 logs.go:282] 0 containers: []
	W1202 19:17:24.456821   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:17:24.456826   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:17:24.456901   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:17:24.481502   54807 cri.go:89] found id: ""
	I1202 19:17:24.481516   54807 logs.go:282] 0 containers: []
	W1202 19:17:24.481523   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:17:24.481529   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:17:24.481587   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:17:24.505876   54807 cri.go:89] found id: ""
	I1202 19:17:24.505918   54807 logs.go:282] 0 containers: []
	W1202 19:17:24.505925   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:17:24.505931   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:17:24.505990   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:17:24.530651   54807 cri.go:89] found id: ""
	I1202 19:17:24.530665   54807 logs.go:282] 0 containers: []
	W1202 19:17:24.530673   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:17:24.530689   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:17:24.530749   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:17:24.556247   54807 cri.go:89] found id: ""
	I1202 19:17:24.556260   54807 logs.go:282] 0 containers: []
	W1202 19:17:24.556277   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:17:24.556283   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:17:24.556391   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:17:24.585748   54807 cri.go:89] found id: ""
	I1202 19:17:24.585761   54807 logs.go:282] 0 containers: []
	W1202 19:17:24.585769   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:17:24.585774   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:17:24.585833   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:17:24.610350   54807 cri.go:89] found id: ""
	I1202 19:17:24.610363   54807 logs.go:282] 0 containers: []
	W1202 19:17:24.610370   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:17:24.610377   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:17:24.610388   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:17:24.680866   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:17:24.664630   16879 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:24.665302   16879 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:24.667106   16879 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:24.667645   16879 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:24.669330   16879 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:17:24.664630   16879 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:24.665302   16879 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:24.667106   16879 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:24.667645   16879 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:24.669330   16879 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:17:24.680876   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:17:24.680887   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:17:24.756955   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:17:24.756975   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:17:24.784854   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:17:24.784869   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:17:24.849848   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:17:24.849872   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:17:27.361613   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:17:27.375047   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:17:27.375145   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:17:27.399753   54807 cri.go:89] found id: ""
	I1202 19:17:27.399767   54807 logs.go:282] 0 containers: []
	W1202 19:17:27.399774   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:17:27.399780   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:17:27.399838   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:17:27.430016   54807 cri.go:89] found id: ""
	I1202 19:17:27.430030   54807 logs.go:282] 0 containers: []
	W1202 19:17:27.430037   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:17:27.430043   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:17:27.430102   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:17:27.455165   54807 cri.go:89] found id: ""
	I1202 19:17:27.455178   54807 logs.go:282] 0 containers: []
	W1202 19:17:27.455186   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:17:27.455191   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:17:27.455251   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:17:27.481353   54807 cri.go:89] found id: ""
	I1202 19:17:27.481367   54807 logs.go:282] 0 containers: []
	W1202 19:17:27.481374   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:17:27.481380   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:17:27.481437   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:17:27.505602   54807 cri.go:89] found id: ""
	I1202 19:17:27.505615   54807 logs.go:282] 0 containers: []
	W1202 19:17:27.505622   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:17:27.505627   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:17:27.505685   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:17:27.531062   54807 cri.go:89] found id: ""
	I1202 19:17:27.531075   54807 logs.go:282] 0 containers: []
	W1202 19:17:27.531082   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:17:27.531087   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:17:27.531143   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:17:27.556614   54807 cri.go:89] found id: ""
	I1202 19:17:27.556628   54807 logs.go:282] 0 containers: []
	W1202 19:17:27.556635   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:17:27.556642   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:17:27.556653   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:17:27.623535   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:17:27.615982   16984 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:27.616782   16984 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:27.618353   16984 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:27.618650   16984 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:27.620147   16984 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:17:27.615982   16984 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:27.616782   16984 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:27.618353   16984 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:27.618650   16984 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:27.620147   16984 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:17:27.623546   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:17:27.623557   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:17:27.692276   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:17:27.692294   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:17:27.728468   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:17:27.728489   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:17:27.790653   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:17:27.790670   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:17:30.302100   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:17:30.313066   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:17:30.313144   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:17:30.340123   54807 cri.go:89] found id: ""
	I1202 19:17:30.340137   54807 logs.go:282] 0 containers: []
	W1202 19:17:30.340144   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:17:30.340149   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:17:30.340208   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:17:30.365806   54807 cri.go:89] found id: ""
	I1202 19:17:30.365820   54807 logs.go:282] 0 containers: []
	W1202 19:17:30.365835   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:17:30.365841   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:17:30.365904   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:17:30.391688   54807 cri.go:89] found id: ""
	I1202 19:17:30.391701   54807 logs.go:282] 0 containers: []
	W1202 19:17:30.391708   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:17:30.391714   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:17:30.391771   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:17:30.416982   54807 cri.go:89] found id: ""
	I1202 19:17:30.416996   54807 logs.go:282] 0 containers: []
	W1202 19:17:30.417013   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:17:30.417019   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:17:30.417117   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:17:30.443139   54807 cri.go:89] found id: ""
	I1202 19:17:30.443153   54807 logs.go:282] 0 containers: []
	W1202 19:17:30.443162   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:17:30.443168   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:17:30.443226   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:17:30.468557   54807 cri.go:89] found id: ""
	I1202 19:17:30.468571   54807 logs.go:282] 0 containers: []
	W1202 19:17:30.468579   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:17:30.468584   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:17:30.468641   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:17:30.494467   54807 cri.go:89] found id: ""
	I1202 19:17:30.494480   54807 logs.go:282] 0 containers: []
	W1202 19:17:30.494488   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:17:30.494502   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:17:30.494515   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:17:30.551986   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:17:30.552005   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:17:30.563168   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:17:30.563184   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:17:30.628562   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:17:30.620608   17094 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:30.621322   17094 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:30.623022   17094 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:30.623454   17094 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:30.625008   17094 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:17:30.620608   17094 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:30.621322   17094 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:30.623022   17094 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:30.623454   17094 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:30.625008   17094 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:17:30.628573   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:17:30.628584   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:17:30.691460   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:17:30.691478   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:17:33.223672   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:17:33.234425   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:17:33.234485   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:17:33.262500   54807 cri.go:89] found id: ""
	I1202 19:17:33.262514   54807 logs.go:282] 0 containers: []
	W1202 19:17:33.262521   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:17:33.262527   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:17:33.262590   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:17:33.287888   54807 cri.go:89] found id: ""
	I1202 19:17:33.287902   54807 logs.go:282] 0 containers: []
	W1202 19:17:33.287921   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:17:33.287926   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:17:33.287995   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:17:33.314581   54807 cri.go:89] found id: ""
	I1202 19:17:33.314594   54807 logs.go:282] 0 containers: []
	W1202 19:17:33.314601   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:17:33.314607   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:17:33.314671   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:17:33.338734   54807 cri.go:89] found id: ""
	I1202 19:17:33.338747   54807 logs.go:282] 0 containers: []
	W1202 19:17:33.338755   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:17:33.338760   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:17:33.338818   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:17:33.363343   54807 cri.go:89] found id: ""
	I1202 19:17:33.363356   54807 logs.go:282] 0 containers: []
	W1202 19:17:33.363363   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:17:33.363369   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:17:33.363425   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:17:33.388256   54807 cri.go:89] found id: ""
	I1202 19:17:33.388270   54807 logs.go:282] 0 containers: []
	W1202 19:17:33.388277   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:17:33.388283   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:17:33.388360   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:17:33.412424   54807 cri.go:89] found id: ""
	I1202 19:17:33.412449   54807 logs.go:282] 0 containers: []
	W1202 19:17:33.412456   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:17:33.412465   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:17:33.412475   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:17:33.467817   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:17:33.467835   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:17:33.479194   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:17:33.479209   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:17:33.548484   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:17:33.540299   17201 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:33.540851   17201 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:33.542579   17201 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:33.543074   17201 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:33.544871   17201 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:17:33.540299   17201 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:33.540851   17201 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:33.542579   17201 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:33.543074   17201 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:33.544871   17201 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:17:33.548494   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:17:33.548505   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:17:33.612889   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:17:33.612909   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:17:36.146985   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:17:36.158019   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:17:36.158079   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:17:36.188906   54807 cri.go:89] found id: ""
	I1202 19:17:36.188919   54807 logs.go:282] 0 containers: []
	W1202 19:17:36.188932   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:17:36.188938   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:17:36.188996   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:17:36.213390   54807 cri.go:89] found id: ""
	I1202 19:17:36.213404   54807 logs.go:282] 0 containers: []
	W1202 19:17:36.213411   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:17:36.213416   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:17:36.213481   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:17:36.242801   54807 cri.go:89] found id: ""
	I1202 19:17:36.242814   54807 logs.go:282] 0 containers: []
	W1202 19:17:36.242822   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:17:36.242827   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:17:36.242882   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:17:36.269121   54807 cri.go:89] found id: ""
	I1202 19:17:36.269142   54807 logs.go:282] 0 containers: []
	W1202 19:17:36.269149   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:17:36.269155   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:17:36.269212   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:17:36.295182   54807 cri.go:89] found id: ""
	I1202 19:17:36.295196   54807 logs.go:282] 0 containers: []
	W1202 19:17:36.295203   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:17:36.295208   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:17:36.295265   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:17:36.320684   54807 cri.go:89] found id: ""
	I1202 19:17:36.320698   54807 logs.go:282] 0 containers: []
	W1202 19:17:36.320705   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:17:36.320711   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:17:36.320783   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:17:36.347524   54807 cri.go:89] found id: ""
	I1202 19:17:36.347537   54807 logs.go:282] 0 containers: []
	W1202 19:17:36.347545   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:17:36.347553   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:17:36.347564   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:17:36.358349   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:17:36.358364   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:17:36.419970   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:17:36.412170   17302 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:36.412950   17302 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:36.414493   17302 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:36.414845   17302 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:36.416368   17302 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:17:36.412170   17302 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:36.412950   17302 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:36.414493   17302 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:36.414845   17302 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:36.416368   17302 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:17:36.419980   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:17:36.419991   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:17:36.482180   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:17:36.482199   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:17:36.511443   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:17:36.511458   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:17:39.067437   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:17:39.077694   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:17:39.077763   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:17:39.102742   54807 cri.go:89] found id: ""
	I1202 19:17:39.102755   54807 logs.go:282] 0 containers: []
	W1202 19:17:39.102762   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:17:39.102768   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:17:39.102824   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:17:39.127352   54807 cri.go:89] found id: ""
	I1202 19:17:39.127365   54807 logs.go:282] 0 containers: []
	W1202 19:17:39.127371   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:17:39.127376   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:17:39.127433   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:17:39.155704   54807 cri.go:89] found id: ""
	I1202 19:17:39.155717   54807 logs.go:282] 0 containers: []
	W1202 19:17:39.155725   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:17:39.155730   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:17:39.155793   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:17:39.181102   54807 cri.go:89] found id: ""
	I1202 19:17:39.181121   54807 logs.go:282] 0 containers: []
	W1202 19:17:39.181128   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:17:39.181133   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:17:39.181193   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:17:39.204855   54807 cri.go:89] found id: ""
	I1202 19:17:39.204869   54807 logs.go:282] 0 containers: []
	W1202 19:17:39.204876   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:17:39.204881   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:17:39.204936   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:17:39.228875   54807 cri.go:89] found id: ""
	I1202 19:17:39.228889   54807 logs.go:282] 0 containers: []
	W1202 19:17:39.228896   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:17:39.228901   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:17:39.228961   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:17:39.254647   54807 cri.go:89] found id: ""
	I1202 19:17:39.254661   54807 logs.go:282] 0 containers: []
	W1202 19:17:39.254668   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:17:39.254681   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:17:39.254696   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:17:39.266611   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:17:39.266628   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:17:39.329195   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:17:39.321572   17406 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:39.322010   17406 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:39.323495   17406 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:39.323811   17406 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:39.325283   17406 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:17:39.321572   17406 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:39.322010   17406 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:39.323495   17406 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:39.323811   17406 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:39.325283   17406 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:17:39.329204   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:17:39.329215   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:17:39.390326   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:17:39.390345   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:17:39.419151   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:17:39.419176   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:17:41.975528   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:17:41.989057   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:17:41.989132   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:17:42.018363   54807 cri.go:89] found id: ""
	I1202 19:17:42.018376   54807 logs.go:282] 0 containers: []
	W1202 19:17:42.018384   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:17:42.018390   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:17:42.018453   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:17:42.045176   54807 cri.go:89] found id: ""
	I1202 19:17:42.045192   54807 logs.go:282] 0 containers: []
	W1202 19:17:42.045200   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:17:42.045206   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:17:42.045290   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:17:42.075758   54807 cri.go:89] found id: ""
	I1202 19:17:42.075773   54807 logs.go:282] 0 containers: []
	W1202 19:17:42.075781   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:17:42.075787   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:17:42.075856   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:17:42.111739   54807 cri.go:89] found id: ""
	I1202 19:17:42.111754   54807 logs.go:282] 0 containers: []
	W1202 19:17:42.111760   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:17:42.111767   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:17:42.111829   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:17:42.141340   54807 cri.go:89] found id: ""
	I1202 19:17:42.141358   54807 logs.go:282] 0 containers: []
	W1202 19:17:42.141368   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:17:42.141374   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:17:42.141453   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:17:42.171125   54807 cri.go:89] found id: ""
	I1202 19:17:42.171140   54807 logs.go:282] 0 containers: []
	W1202 19:17:42.171159   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:17:42.171166   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:17:42.171236   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:17:42.200254   54807 cri.go:89] found id: ""
	I1202 19:17:42.200272   54807 logs.go:282] 0 containers: []
	W1202 19:17:42.200280   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:17:42.200292   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:17:42.200307   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:17:42.256751   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:17:42.256772   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:17:42.269101   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:17:42.269118   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:17:42.336339   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:17:42.328432   17515 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:42.329095   17515 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:42.330828   17515 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:42.331361   17515 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:42.332853   17515 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:17:42.328432   17515 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:42.329095   17515 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:42.330828   17515 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:42.331361   17515 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:42.332853   17515 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:17:42.336350   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:17:42.336361   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:17:42.397522   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:17:42.397540   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:17:44.932481   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:17:44.944310   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:17:44.944439   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:17:44.992545   54807 cri.go:89] found id: ""
	I1202 19:17:44.992561   54807 logs.go:282] 0 containers: []
	W1202 19:17:44.992568   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:17:44.992574   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:17:44.992643   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:17:45.041739   54807 cri.go:89] found id: ""
	I1202 19:17:45.041756   54807 logs.go:282] 0 containers: []
	W1202 19:17:45.041764   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:17:45.041770   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:17:45.041849   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:17:45.083378   54807 cri.go:89] found id: ""
	I1202 19:17:45.083394   54807 logs.go:282] 0 containers: []
	W1202 19:17:45.083402   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:17:45.083407   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:17:45.083483   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:17:45.119179   54807 cri.go:89] found id: ""
	I1202 19:17:45.119206   54807 logs.go:282] 0 containers: []
	W1202 19:17:45.119214   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:17:45.119220   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:17:45.119340   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:17:45.156515   54807 cri.go:89] found id: ""
	I1202 19:17:45.156574   54807 logs.go:282] 0 containers: []
	W1202 19:17:45.156583   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:17:45.156590   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:17:45.156760   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:17:45.195862   54807 cri.go:89] found id: ""
	I1202 19:17:45.195877   54807 logs.go:282] 0 containers: []
	W1202 19:17:45.195885   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:17:45.195892   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:17:45.195968   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:17:45.229425   54807 cri.go:89] found id: ""
	I1202 19:17:45.229448   54807 logs.go:282] 0 containers: []
	W1202 19:17:45.229457   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:17:45.229466   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:17:45.229477   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:17:45.293109   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:17:45.293125   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:17:45.303969   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:17:45.303985   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:17:45.371653   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:17:45.362842   17616 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:45.363639   17616 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:45.365366   17616 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:45.366008   17616 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:45.367671   17616 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:17:45.362842   17616 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:45.363639   17616 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:45.365366   17616 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:45.366008   17616 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:45.367671   17616 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:17:45.371662   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:17:45.371673   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:17:45.436450   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:17:45.436469   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:17:47.967684   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:17:47.979933   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:17:47.980001   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:17:48.006489   54807 cri.go:89] found id: ""
	I1202 19:17:48.006503   54807 logs.go:282] 0 containers: []
	W1202 19:17:48.006511   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:17:48.006517   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:17:48.006580   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:17:48.035723   54807 cri.go:89] found id: ""
	I1202 19:17:48.035737   54807 logs.go:282] 0 containers: []
	W1202 19:17:48.035745   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:17:48.035760   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:17:48.035820   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:17:48.065220   54807 cri.go:89] found id: ""
	I1202 19:17:48.065233   54807 logs.go:282] 0 containers: []
	W1202 19:17:48.065251   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:17:48.065260   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:17:48.065332   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:17:48.088782   54807 cri.go:89] found id: ""
	I1202 19:17:48.088796   54807 logs.go:282] 0 containers: []
	W1202 19:17:48.088803   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:17:48.088809   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:17:48.088865   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:17:48.113775   54807 cri.go:89] found id: ""
	I1202 19:17:48.113788   54807 logs.go:282] 0 containers: []
	W1202 19:17:48.113799   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:17:48.113808   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:17:48.113867   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:17:48.140235   54807 cri.go:89] found id: ""
	I1202 19:17:48.140248   54807 logs.go:282] 0 containers: []
	W1202 19:17:48.140254   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:17:48.140260   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:17:48.140315   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:17:48.166089   54807 cri.go:89] found id: ""
	I1202 19:17:48.166102   54807 logs.go:282] 0 containers: []
	W1202 19:17:48.166108   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:17:48.166116   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:17:48.166126   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:17:48.192826   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:17:48.192842   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:17:48.248078   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:17:48.248098   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:17:48.258722   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:17:48.258737   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:17:48.323436   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:17:48.316144   17731 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:48.316536   17731 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:48.318116   17731 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:48.318415   17731 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:48.319885   17731 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:17:48.316144   17731 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:48.316536   17731 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:48.318116   17731 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:48.318415   17731 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:48.319885   17731 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:17:48.323445   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:17:48.323456   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:17:50.885477   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:17:50.895878   54807 kubeadm.go:602] duration metric: took 4m3.997047772s to restartPrimaryControlPlane
	W1202 19:17:50.895945   54807 out.go:285] ! Unable to restart control-plane node(s), will reset cluster: <no value>
	I1202 19:17:50.896022   54807 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm reset --cri-socket /run/containerd/containerd.sock --force"
	I1202 19:17:51.304711   54807 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I1202 19:17:51.317725   54807 ssh_runner.go:195] Run: sudo cp /var/tmp/minikube/kubeadm.yaml.new /var/tmp/minikube/kubeadm.yaml
	I1202 19:17:51.325312   54807 kubeadm.go:215] ignoring SystemVerification for kubeadm because of docker driver
	I1202 19:17:51.325381   54807 ssh_runner.go:195] Run: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf
	I1202 19:17:51.332895   54807 kubeadm.go:156] config check failed, skipping stale config cleanup: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	ls: cannot access '/etc/kubernetes/admin.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/kubelet.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/controller-manager.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/scheduler.conf': No such file or directory
	I1202 19:17:51.332904   54807 kubeadm.go:158] found existing configuration files:
	
	I1202 19:17:51.332954   54807 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/admin.conf
	I1202 19:17:51.340776   54807 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/admin.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/admin.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/admin.conf: No such file or directory
	I1202 19:17:51.340830   54807 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/admin.conf
	I1202 19:17:51.348141   54807 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf
	I1202 19:17:51.355804   54807 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/kubelet.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/kubelet.conf: No such file or directory
	I1202 19:17:51.355867   54807 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/kubelet.conf
	I1202 19:17:51.363399   54807 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf
	I1202 19:17:51.371055   54807 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/controller-manager.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/controller-manager.conf: No such file or directory
	I1202 19:17:51.371110   54807 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/controller-manager.conf
	I1202 19:17:51.378528   54807 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf
	I1202 19:17:51.386558   54807 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/scheduler.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/scheduler.conf: No such file or directory
	I1202 19:17:51.386618   54807 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/scheduler.conf
	I1202 19:17:51.394349   54807 ssh_runner.go:286] Start: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables"
	I1202 19:17:51.435339   54807 kubeadm.go:319] [init] Using Kubernetes version: v1.35.0-beta.0
	I1202 19:17:51.435446   54807 kubeadm.go:319] [preflight] Running pre-flight checks
	I1202 19:17:51.512672   54807 kubeadm.go:319] [preflight] The system verification failed. Printing the output from the verification:
	I1202 19:17:51.512738   54807 kubeadm.go:319] KERNEL_VERSION: 5.15.0-1084-aws
	I1202 19:17:51.512772   54807 kubeadm.go:319] OS: Linux
	I1202 19:17:51.512816   54807 kubeadm.go:319] CGROUPS_CPU: enabled
	I1202 19:17:51.512863   54807 kubeadm.go:319] CGROUPS_CPUACCT: enabled
	I1202 19:17:51.512909   54807 kubeadm.go:319] CGROUPS_CPUSET: enabled
	I1202 19:17:51.512961   54807 kubeadm.go:319] CGROUPS_DEVICES: enabled
	I1202 19:17:51.513009   54807 kubeadm.go:319] CGROUPS_FREEZER: enabled
	I1202 19:17:51.513055   54807 kubeadm.go:319] CGROUPS_MEMORY: enabled
	I1202 19:17:51.513099   54807 kubeadm.go:319] CGROUPS_PIDS: enabled
	I1202 19:17:51.513146   54807 kubeadm.go:319] CGROUPS_HUGETLB: enabled
	I1202 19:17:51.513190   54807 kubeadm.go:319] CGROUPS_BLKIO: enabled
	I1202 19:17:51.580412   54807 kubeadm.go:319] [preflight] Pulling images required for setting up a Kubernetes cluster
	I1202 19:17:51.580517   54807 kubeadm.go:319] [preflight] This might take a minute or two, depending on the speed of your internet connection
	I1202 19:17:51.580607   54807 kubeadm.go:319] [preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	I1202 19:17:51.588752   54807 kubeadm.go:319] [certs] Using certificateDir folder "/var/lib/minikube/certs"
	I1202 19:17:51.594117   54807 out.go:252]   - Generating certificates and keys ...
	I1202 19:17:51.594201   54807 kubeadm.go:319] [certs] Using existing ca certificate authority
	I1202 19:17:51.594273   54807 kubeadm.go:319] [certs] Using existing apiserver certificate and key on disk
	I1202 19:17:51.594354   54807 kubeadm.go:319] [certs] Using existing apiserver-kubelet-client certificate and key on disk
	I1202 19:17:51.594424   54807 kubeadm.go:319] [certs] Using existing front-proxy-ca certificate authority
	I1202 19:17:51.594494   54807 kubeadm.go:319] [certs] Using existing front-proxy-client certificate and key on disk
	I1202 19:17:51.594547   54807 kubeadm.go:319] [certs] Using existing etcd/ca certificate authority
	I1202 19:17:51.594610   54807 kubeadm.go:319] [certs] Using existing etcd/server certificate and key on disk
	I1202 19:17:51.594671   54807 kubeadm.go:319] [certs] Using existing etcd/peer certificate and key on disk
	I1202 19:17:51.594744   54807 kubeadm.go:319] [certs] Using existing etcd/healthcheck-client certificate and key on disk
	I1202 19:17:51.594818   54807 kubeadm.go:319] [certs] Using existing apiserver-etcd-client certificate and key on disk
	I1202 19:17:51.594855   54807 kubeadm.go:319] [certs] Using the existing "sa" key
	I1202 19:17:51.594910   54807 kubeadm.go:319] [kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	I1202 19:17:51.705531   54807 kubeadm.go:319] [kubeconfig] Writing "admin.conf" kubeconfig file
	I1202 19:17:51.854203   54807 kubeadm.go:319] [kubeconfig] Writing "super-admin.conf" kubeconfig file
	I1202 19:17:52.029847   54807 kubeadm.go:319] [kubeconfig] Writing "kubelet.conf" kubeconfig file
	I1202 19:17:52.545269   54807 kubeadm.go:319] [kubeconfig] Writing "controller-manager.conf" kubeconfig file
	I1202 19:17:52.727822   54807 kubeadm.go:319] [kubeconfig] Writing "scheduler.conf" kubeconfig file
	I1202 19:17:52.728412   54807 kubeadm.go:319] [etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	I1202 19:17:52.730898   54807 kubeadm.go:319] [control-plane] Using manifest folder "/etc/kubernetes/manifests"
	I1202 19:17:52.734122   54807 out.go:252]   - Booting up control plane ...
	I1202 19:17:52.734222   54807 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-apiserver"
	I1202 19:17:52.734305   54807 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-controller-manager"
	I1202 19:17:52.734375   54807 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-scheduler"
	I1202 19:17:52.754118   54807 kubeadm.go:319] [kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	I1202 19:17:52.754386   54807 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	I1202 19:17:52.762146   54807 kubeadm.go:319] [patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	I1202 19:17:52.762405   54807 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	I1202 19:17:52.762460   54807 kubeadm.go:319] [kubelet-start] Starting the kubelet
	I1202 19:17:52.891581   54807 kubeadm.go:319] [wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	I1202 19:17:52.891694   54807 kubeadm.go:319] [kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	I1202 19:21:52.892779   54807 kubeadm.go:319] [kubelet-check] The kubelet is not healthy after 4m0.001197768s
	I1202 19:21:52.892808   54807 kubeadm.go:319] 
	I1202 19:21:52.892871   54807 kubeadm.go:319] Unfortunately, an error has occurred, likely caused by:
	I1202 19:21:52.892903   54807 kubeadm.go:319] 	- The kubelet is not running
	I1202 19:21:52.893025   54807 kubeadm.go:319] 	- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	I1202 19:21:52.893030   54807 kubeadm.go:319] 
	I1202 19:21:52.893133   54807 kubeadm.go:319] If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
	I1202 19:21:52.893170   54807 kubeadm.go:319] 	- 'systemctl status kubelet'
	I1202 19:21:52.893200   54807 kubeadm.go:319] 	- 'journalctl -xeu kubelet'
	I1202 19:21:52.893203   54807 kubeadm.go:319] 
	I1202 19:21:52.897451   54807 kubeadm.go:319] 	[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
	I1202 19:21:52.897878   54807 kubeadm.go:319] 	[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
	I1202 19:21:52.897986   54807 kubeadm.go:319] 	[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	I1202 19:21:52.898220   54807 kubeadm.go:319] error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	I1202 19:21:52.898225   54807 kubeadm.go:319] 
	I1202 19:21:52.898299   54807 kubeadm.go:319] To see the stack trace of this error execute with --v=5 or higher
	W1202 19:21:52.898412   54807 out.go:285] ! initialization failed, will try again: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-beta.0
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.001197768s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	
	To see the stack trace of this error execute with --v=5 or higher
	
	I1202 19:21:52.898501   54807 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm reset --cri-socket /run/containerd/containerd.sock --force"
	I1202 19:21:53.323346   54807 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I1202 19:21:53.337542   54807 kubeadm.go:215] ignoring SystemVerification for kubeadm because of docker driver
	I1202 19:21:53.337600   54807 ssh_runner.go:195] Run: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf
	I1202 19:21:53.345331   54807 kubeadm.go:156] config check failed, skipping stale config cleanup: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	ls: cannot access '/etc/kubernetes/admin.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/kubelet.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/controller-manager.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/scheduler.conf': No such file or directory
	I1202 19:21:53.345341   54807 kubeadm.go:158] found existing configuration files:
	
	I1202 19:21:53.345394   54807 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/admin.conf
	I1202 19:21:53.352948   54807 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/admin.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/admin.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/admin.conf: No such file or directory
	I1202 19:21:53.353002   54807 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/admin.conf
	I1202 19:21:53.360251   54807 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf
	I1202 19:21:53.367769   54807 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/kubelet.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/kubelet.conf: No such file or directory
	I1202 19:21:53.367833   54807 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/kubelet.conf
	I1202 19:21:53.375319   54807 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf
	I1202 19:21:53.383107   54807 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/controller-manager.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/controller-manager.conf: No such file or directory
	I1202 19:21:53.383164   54807 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/controller-manager.conf
	I1202 19:21:53.390823   54807 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf
	I1202 19:21:53.398923   54807 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/scheduler.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/scheduler.conf: No such file or directory
	I1202 19:21:53.398982   54807 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/scheduler.conf
	I1202 19:21:53.406858   54807 ssh_runner.go:286] Start: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables"
	I1202 19:21:53.455640   54807 kubeadm.go:319] [init] Using Kubernetes version: v1.35.0-beta.0
	I1202 19:21:53.455689   54807 kubeadm.go:319] [preflight] Running pre-flight checks
	I1202 19:21:53.530940   54807 kubeadm.go:319] [preflight] The system verification failed. Printing the output from the verification:
	I1202 19:21:53.531008   54807 kubeadm.go:319] KERNEL_VERSION: 5.15.0-1084-aws
	I1202 19:21:53.531042   54807 kubeadm.go:319] OS: Linux
	I1202 19:21:53.531086   54807 kubeadm.go:319] CGROUPS_CPU: enabled
	I1202 19:21:53.531133   54807 kubeadm.go:319] CGROUPS_CPUACCT: enabled
	I1202 19:21:53.531179   54807 kubeadm.go:319] CGROUPS_CPUSET: enabled
	I1202 19:21:53.531226   54807 kubeadm.go:319] CGROUPS_DEVICES: enabled
	I1202 19:21:53.531273   54807 kubeadm.go:319] CGROUPS_FREEZER: enabled
	I1202 19:21:53.531320   54807 kubeadm.go:319] CGROUPS_MEMORY: enabled
	I1202 19:21:53.531364   54807 kubeadm.go:319] CGROUPS_PIDS: enabled
	I1202 19:21:53.531410   54807 kubeadm.go:319] CGROUPS_HUGETLB: enabled
	I1202 19:21:53.531455   54807 kubeadm.go:319] CGROUPS_BLKIO: enabled
	I1202 19:21:53.605461   54807 kubeadm.go:319] [preflight] Pulling images required for setting up a Kubernetes cluster
	I1202 19:21:53.605584   54807 kubeadm.go:319] [preflight] This might take a minute or two, depending on the speed of your internet connection
	I1202 19:21:53.605706   54807 kubeadm.go:319] [preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	I1202 19:21:53.611090   54807 kubeadm.go:319] [certs] Using certificateDir folder "/var/lib/minikube/certs"
	I1202 19:21:53.616552   54807 out.go:252]   - Generating certificates and keys ...
	I1202 19:21:53.616667   54807 kubeadm.go:319] [certs] Using existing ca certificate authority
	I1202 19:21:53.616734   54807 kubeadm.go:319] [certs] Using existing apiserver certificate and key on disk
	I1202 19:21:53.616826   54807 kubeadm.go:319] [certs] Using existing apiserver-kubelet-client certificate and key on disk
	I1202 19:21:53.616887   54807 kubeadm.go:319] [certs] Using existing front-proxy-ca certificate authority
	I1202 19:21:53.616955   54807 kubeadm.go:319] [certs] Using existing front-proxy-client certificate and key on disk
	I1202 19:21:53.617008   54807 kubeadm.go:319] [certs] Using existing etcd/ca certificate authority
	I1202 19:21:53.617070   54807 kubeadm.go:319] [certs] Using existing etcd/server certificate and key on disk
	I1202 19:21:53.617132   54807 kubeadm.go:319] [certs] Using existing etcd/peer certificate and key on disk
	I1202 19:21:53.617207   54807 kubeadm.go:319] [certs] Using existing etcd/healthcheck-client certificate and key on disk
	I1202 19:21:53.617278   54807 kubeadm.go:319] [certs] Using existing apiserver-etcd-client certificate and key on disk
	I1202 19:21:53.617314   54807 kubeadm.go:319] [certs] Using the existing "sa" key
	I1202 19:21:53.617369   54807 kubeadm.go:319] [kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	I1202 19:21:53.704407   54807 kubeadm.go:319] [kubeconfig] Writing "admin.conf" kubeconfig file
	I1202 19:21:53.921613   54807 kubeadm.go:319] [kubeconfig] Writing "super-admin.conf" kubeconfig file
	I1202 19:21:54.521217   54807 kubeadm.go:319] [kubeconfig] Writing "kubelet.conf" kubeconfig file
	I1202 19:21:54.609103   54807 kubeadm.go:319] [kubeconfig] Writing "controller-manager.conf" kubeconfig file
	I1202 19:21:54.800380   54807 kubeadm.go:319] [kubeconfig] Writing "scheduler.conf" kubeconfig file
	I1202 19:21:54.800923   54807 kubeadm.go:319] [etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	I1202 19:21:54.803676   54807 kubeadm.go:319] [control-plane] Using manifest folder "/etc/kubernetes/manifests"
	I1202 19:21:54.806989   54807 out.go:252]   - Booting up control plane ...
	I1202 19:21:54.807091   54807 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-apiserver"
	I1202 19:21:54.807173   54807 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-controller-manager"
	I1202 19:21:54.807243   54807 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-scheduler"
	I1202 19:21:54.831648   54807 kubeadm.go:319] [kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	I1202 19:21:54.831750   54807 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	I1202 19:21:54.839547   54807 kubeadm.go:319] [patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	I1202 19:21:54.840014   54807 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	I1202 19:21:54.840081   54807 kubeadm.go:319] [kubelet-start] Starting the kubelet
	I1202 19:21:54.986075   54807 kubeadm.go:319] [wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	I1202 19:21:54.986189   54807 kubeadm.go:319] [kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	I1202 19:25:54.986676   54807 kubeadm.go:319] [kubelet-check] The kubelet is not healthy after 4m0.001082452s
	I1202 19:25:54.986700   54807 kubeadm.go:319] 
	I1202 19:25:54.986752   54807 kubeadm.go:319] Unfortunately, an error has occurred, likely caused by:
	I1202 19:25:54.986782   54807 kubeadm.go:319] 	- The kubelet is not running
	I1202 19:25:54.986880   54807 kubeadm.go:319] 	- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	I1202 19:25:54.986884   54807 kubeadm.go:319] 
	I1202 19:25:54.986982   54807 kubeadm.go:319] If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
	I1202 19:25:54.987011   54807 kubeadm.go:319] 	- 'systemctl status kubelet'
	I1202 19:25:54.987040   54807 kubeadm.go:319] 	- 'journalctl -xeu kubelet'
	I1202 19:25:54.987043   54807 kubeadm.go:319] 
	I1202 19:25:54.991498   54807 kubeadm.go:319] 	[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
	I1202 19:25:54.991923   54807 kubeadm.go:319] 	[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
	I1202 19:25:54.992031   54807 kubeadm.go:319] 	[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	I1202 19:25:54.992264   54807 kubeadm.go:319] error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	I1202 19:25:54.992269   54807 kubeadm.go:319] 
	I1202 19:25:54.992355   54807 kubeadm.go:319] To see the stack trace of this error execute with --v=5 or higher
	I1202 19:25:54.992407   54807 kubeadm.go:403] duration metric: took 12m8.130118214s to StartCluster
	I1202 19:25:54.992437   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:25:54.992498   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:25:55.018059   54807 cri.go:89] found id: ""
	I1202 19:25:55.018073   54807 logs.go:282] 0 containers: []
	W1202 19:25:55.018079   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:25:55.018085   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:25:55.018141   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:25:55.046728   54807 cri.go:89] found id: ""
	I1202 19:25:55.046741   54807 logs.go:282] 0 containers: []
	W1202 19:25:55.046749   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:25:55.046755   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:25:55.046820   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:25:55.073607   54807 cri.go:89] found id: ""
	I1202 19:25:55.073621   54807 logs.go:282] 0 containers: []
	W1202 19:25:55.073629   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:25:55.073638   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:25:55.073698   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:25:55.098149   54807 cri.go:89] found id: ""
	I1202 19:25:55.098163   54807 logs.go:282] 0 containers: []
	W1202 19:25:55.098170   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:25:55.098175   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:25:55.098231   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:25:55.126700   54807 cri.go:89] found id: ""
	I1202 19:25:55.126714   54807 logs.go:282] 0 containers: []
	W1202 19:25:55.126721   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:25:55.126727   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:25:55.126783   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:25:55.151684   54807 cri.go:89] found id: ""
	I1202 19:25:55.151697   54807 logs.go:282] 0 containers: []
	W1202 19:25:55.151704   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:25:55.151718   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:25:55.151776   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:25:55.179814   54807 cri.go:89] found id: ""
	I1202 19:25:55.179827   54807 logs.go:282] 0 containers: []
	W1202 19:25:55.179834   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:25:55.179842   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:25:55.179852   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:25:55.209677   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:25:55.209693   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:25:55.267260   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:25:55.267277   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:25:55.278280   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:25:55.278301   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:25:55.341995   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:25:55.334367   21526 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:25:55.334759   21526 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:25:55.336266   21526 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:25:55.336611   21526 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:25:55.338027   21526 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:25:55.334367   21526 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:25:55.334759   21526 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:25:55.336266   21526 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:25:55.336611   21526 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:25:55.338027   21526 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:25:55.342006   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:25:55.342016   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	W1202 19:25:55.404636   54807 out.go:434] Error starting cluster: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-beta.0
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.001082452s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	
	To see the stack trace of this error execute with --v=5 or higher
	W1202 19:25:55.404681   54807 out.go:285] * 
	W1202 19:25:55.404792   54807 out.go:285] X Error starting cluster: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-beta.0
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.001082452s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	
	To see the stack trace of this error execute with --v=5 or higher
	
	W1202 19:25:55.404837   54807 out.go:285] * 
	W1202 19:25:55.406981   54807 out.go:308] ╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	I1202 19:25:55.412566   54807 out.go:203] 
	W1202 19:25:55.416194   54807 out.go:285] X Exiting due to K8S_KUBELET_NOT_RUNNING: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-beta.0
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.001082452s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	
	To see the stack trace of this error execute with --v=5 or higher
	
	W1202 19:25:55.416239   54807 out.go:285] * Suggestion: Check output of 'journalctl -xeu kubelet', try passing --extra-config=kubelet.cgroup-driver=systemd to minikube start
	W1202 19:25:55.416259   54807 out.go:285] * Related issue: https://github.com/kubernetes/minikube/issues/4172
	I1202 19:25:55.420152   54807 out.go:203] 
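Editor's note on the failure above: the SystemVerification warning and the kubelet validation error are consistent; this node mounts cgroup v1, and kubelet v1.35 refuses to start there unless that is explicitly allowed. Per the warning text, the opt-out is the `failCgroupV1` option in the kubelet configuration (the warning must also be explicitly skipped). A minimal sketch of the relevant KubeletConfiguration fragment, with only the fields the warning names:

```yaml
# KubeletConfiguration fragment (sketch): opts back into cgroup v1 support for
# kubelet v1.35+, as described by the SystemVerification warning in the log above.
# See https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
failCgroupV1: false
```

The alternative, as the minikube suggestion below implies, is to run the node on a cgroup v2 host rather than opting back into v1.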
	
	
	==> container status <==
	CONTAINER           IMAGE               CREATED             STATE               NAME                ATTEMPT             POD ID              POD                 NAMESPACE
	
	
	==> containerd <==
	Dec 02 19:13:45 functional-449836 containerd[10263]: time="2025-12-02T19:13:45.578599853Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1
	Dec 02 19:13:45 functional-449836 containerd[10263]: time="2025-12-02T19:13:45.578662622Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1
	Dec 02 19:13:45 functional-449836 containerd[10263]: time="2025-12-02T19:13:45.578725071Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1
	Dec 02 19:13:45 functional-449836 containerd[10263]: time="2025-12-02T19:13:45.578783803Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1
	Dec 02 19:13:45 functional-449836 containerd[10263]: time="2025-12-02T19:13:45.578855024Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1
	Dec 02 19:13:45 functional-449836 containerd[10263]: time="2025-12-02T19:13:45.578924119Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1
	Dec 02 19:13:45 functional-449836 containerd[10263]: time="2025-12-02T19:13:45.578991820Z" level=info msg="runtime interface created"
	Dec 02 19:13:45 functional-449836 containerd[10263]: time="2025-12-02T19:13:45.579043832Z" level=info msg="created NRI interface"
	Dec 02 19:13:45 functional-449836 containerd[10263]: time="2025-12-02T19:13:45.579105847Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1
	Dec 02 19:13:45 functional-449836 containerd[10263]: time="2025-12-02T19:13:45.579207451Z" level=info msg="Connect containerd service"
	Dec 02 19:13:45 functional-449836 containerd[10263]: time="2025-12-02T19:13:45.579595759Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this"
	Dec 02 19:13:45 functional-449836 containerd[10263]: time="2025-12-02T19:13:45.580416453Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config"
	Dec 02 19:13:45 functional-449836 containerd[10263]: time="2025-12-02T19:13:45.590441353Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc
	Dec 02 19:13:45 functional-449836 containerd[10263]: time="2025-12-02T19:13:45.590507150Z" level=info msg=serving... address=/run/containerd/containerd.sock
	Dec 02 19:13:45 functional-449836 containerd[10263]: time="2025-12-02T19:13:45.590537673Z" level=info msg="Start subscribing containerd event"
	Dec 02 19:13:45 functional-449836 containerd[10263]: time="2025-12-02T19:13:45.590591277Z" level=info msg="Start recovering state"
	Dec 02 19:13:45 functional-449836 containerd[10263]: time="2025-12-02T19:13:45.614386130Z" level=info msg="Start event monitor"
	Dec 02 19:13:45 functional-449836 containerd[10263]: time="2025-12-02T19:13:45.614577326Z" level=info msg="Start cni network conf syncer for default"
	Dec 02 19:13:45 functional-449836 containerd[10263]: time="2025-12-02T19:13:45.614677601Z" level=info msg="Start streaming server"
	Dec 02 19:13:45 functional-449836 containerd[10263]: time="2025-12-02T19:13:45.614762451Z" level=info msg="Registered namespace \"k8s.io\" with NRI"
	Dec 02 19:13:45 functional-449836 containerd[10263]: time="2025-12-02T19:13:45.614968071Z" level=info msg="runtime interface starting up..."
	Dec 02 19:13:45 functional-449836 containerd[10263]: time="2025-12-02T19:13:45.615037774Z" level=info msg="starting plugins..."
	Dec 02 19:13:45 functional-449836 containerd[10263]: time="2025-12-02T19:13:45.615100272Z" level=info msg="Synchronizing NRI (plugin) with current runtime state"
	Dec 02 19:13:45 functional-449836 containerd[10263]: time="2025-12-02T19:13:45.615329048Z" level=info msg="containerd successfully booted in 0.058232s"
	Dec 02 19:13:45 functional-449836 systemd[1]: Started containerd.service - containerd container runtime.
	
	
	==> describe nodes <==
	command /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" failed with error: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:25:58.796356   21777 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:25:58.797254   21777 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:25:58.798758   21777 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:25:58.799272   21777 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:25:58.800957   21777 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	
	==> dmesg <==
	[Dec 2 18:17] ACPI: SRAT not present
	[  +0.000000] ACPI: SRAT not present
	[  +0.000000] SPI driver altr_a10sr has no spi_device_id for altr,a10sr
	[  +0.015127] device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
	[  +0.494583] systemd[1]: Configuration file /run/systemd/system/netplan-ovs-cleanup.service is marked world-inaccessible. This has no effect as configuration data is accessible via APIs without restrictions. Proceeding anyway.
	[  +0.035754] systemd[1]: /lib/systemd/system/snapd.service:23: Unknown key name 'RestartMode' in section 'Service', ignoring.
	[  +0.870945] ena 0000:00:05.0: LLQ is not supported Fallback to host mode policy.
	[  +6.299680] kauditd_printk_skb: 36 callbacks suppressed
	
	
	==> kernel <==
	 19:25:58 up  1:08,  0 user,  load average: 0.03, 0.15, 0.33
	Linux functional-449836 5.15.0-1084-aws #91~20.04.1-Ubuntu SMP Fri May 2 07:00:04 UTC 2025 aarch64 GNU/Linux
	PRETTY_NAME="Debian GNU/Linux 12 (bookworm)"
	
	
	==> kubelet <==
	Dec 02 19:25:55 functional-449836 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 02 19:25:56 functional-449836 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 322.
	Dec 02 19:25:56 functional-449836 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 02 19:25:56 functional-449836 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 02 19:25:56 functional-449836 kubelet[21593]: E1202 19:25:56.480645   21593 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 02 19:25:56 functional-449836 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 02 19:25:56 functional-449836 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 02 19:25:57 functional-449836 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 323.
	Dec 02 19:25:57 functional-449836 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 02 19:25:57 functional-449836 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 02 19:25:57 functional-449836 kubelet[21652]: E1202 19:25:57.264833   21652 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 02 19:25:57 functional-449836 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 02 19:25:57 functional-449836 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 02 19:25:57 functional-449836 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 324.
	Dec 02 19:25:57 functional-449836 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 02 19:25:57 functional-449836 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 02 19:25:57 functional-449836 kubelet[21686]: E1202 19:25:57.982576   21686 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 02 19:25:57 functional-449836 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 02 19:25:57 functional-449836 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 02 19:25:58 functional-449836 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 325.
	Dec 02 19:25:58 functional-449836 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 02 19:25:58 functional-449836 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 02 19:25:58 functional-449836 kubelet[21764]: E1202 19:25:58.726565   21764 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 02 19:25:58 functional-449836 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 02 19:25:58 functional-449836 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	

-- /stdout --
helpers_test.go:262: (dbg) Run:  out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-449836 -n functional-449836
helpers_test.go:262: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-449836 -n functional-449836: exit status 2 (362.392356ms)

-- stdout --
	Stopped

-- /stdout --
helpers_test.go:262: status error: exit status 2 (may be ok)
helpers_test.go:264: "functional-449836" apiserver is not running, skipping kubectl commands (state="Stopped")
--- FAIL: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/ComponentHealth (2.27s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/InvalidService (0.06s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/InvalidService
functional_test.go:2326: (dbg) Run:  kubectl --context functional-449836 apply -f testdata/invalidsvc.yaml
functional_test.go:2326: (dbg) Non-zero exit: kubectl --context functional-449836 apply -f testdata/invalidsvc.yaml: exit status 1 (61.35705ms)

** stderr ** 
	error: error validating "testdata/invalidsvc.yaml": error validating data: failed to download openapi: Get "https://192.168.49.2:8441/openapi/v2?timeout=32s": dial tcp 192.168.49.2:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false

** /stderr **
functional_test.go:2328: kubectl --context functional-449836 apply -f testdata/invalidsvc.yaml failed: exit status 1
--- FAIL: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/InvalidService (0.06s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/DashboardCmd (1.71s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/DashboardCmd
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/DashboardCmd

=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/DashboardCmd
functional_test.go:920: (dbg) daemon: [out/minikube-linux-arm64 dashboard --url --port 36195 -p functional-449836 --alsologtostderr -v=1]
functional_test.go:933: output didn't produce a URL
functional_test.go:925: (dbg) stopping [out/minikube-linux-arm64 dashboard --url --port 36195 -p functional-449836 --alsologtostderr -v=1] ...
functional_test.go:925: (dbg) [out/minikube-linux-arm64 dashboard --url --port 36195 -p functional-449836 --alsologtostderr -v=1] stdout:
functional_test.go:925: (dbg) [out/minikube-linux-arm64 dashboard --url --port 36195 -p functional-449836 --alsologtostderr -v=1] stderr:
I1202 19:27:54.622092   72154 out.go:360] Setting OutFile to fd 1 ...
I1202 19:27:54.622244   72154 out.go:408] TERM=,COLORTERM=, which probably does not support color
I1202 19:27:54.622255   72154 out.go:374] Setting ErrFile to fd 2...
I1202 19:27:54.622260   72154 out.go:408] TERM=,COLORTERM=, which probably does not support color
I1202 19:27:54.622616   72154 root.go:338] Updating PATH: /home/jenkins/minikube-integration/22021-2487/.minikube/bin
I1202 19:27:54.622965   72154 mustload.go:66] Loading cluster: functional-449836
I1202 19:27:54.623655   72154 config.go:182] Loaded profile config "functional-449836": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-beta.0
I1202 19:27:54.624303   72154 cli_runner.go:164] Run: docker container inspect functional-449836 --format={{.State.Status}}
I1202 19:27:54.641892   72154 host.go:66] Checking if "functional-449836" exists ...
I1202 19:27:54.642239   72154 cli_runner.go:164] Run: docker system info --format "{{json .}}"
I1202 19:27:54.699223   72154 info.go:266] docker info: {ID:EOU5:DNGX:XN6V:L2FZ:UXRM:5TWK:EVUR:KC2F:GT7Z:Y4O4:GB77:5PD3 Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:52 SystemTime:2025-12-02 19:27:54.689441488 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214839296 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-31-251 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
I1202 19:27:54.699342   72154 api_server.go:166] Checking apiserver status ...
I1202 19:27:54.699408   72154 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
I1202 19:27:54.699452   72154 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-449836
I1202 19:27:54.716257   72154 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:32788 SSHKeyPath:/home/jenkins/minikube-integration/22021-2487/.minikube/machines/functional-449836/id_rsa Username:docker}
W1202 19:27:54.826088   72154 api_server.go:170] stopped: unable to get apiserver pid: sudo pgrep -xnf kube-apiserver.*minikube.*: Process exited with status 1
stdout:

stderr:
I1202 19:27:54.829234   72154 out.go:179] * The control-plane node functional-449836 apiserver is not running: (state=Stopped)
I1202 19:27:54.832081   72154 out.go:179]   To start a cluster, run: "minikube start -p functional-449836"
helpers_test.go:222: -----------------------post-mortem--------------------------------
helpers_test.go:223: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/DashboardCmd]: network settings <======
helpers_test.go:230: HOST ENV snapshots: PROXY env: HTTP_PROXY="<empty>" HTTPS_PROXY="<empty>" NO_PROXY="<empty>"
helpers_test.go:238: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/DashboardCmd]: docker inspect <======
helpers_test.go:239: (dbg) Run:  docker inspect functional-449836
helpers_test.go:243: (dbg) docker inspect functional-449836:

-- stdout --
	[
	    {
	        "Id": "6870f21b62bb6903aca3129f1ce4723cf3f2ffad99a50b164f8e2dc04b50e75d",
	        "Created": "2025-12-02T18:58:53.515075222Z",
	        "Path": "/usr/local/bin/entrypoint",
	        "Args": [
	            "/sbin/init"
	        ],
	        "State": {
	            "Status": "running",
	            "Running": true,
	            "Paused": false,
	            "Restarting": false,
	            "OOMKilled": false,
	            "Dead": false,
	            "Pid": 43093,
	            "ExitCode": 0,
	            "Error": "",
	            "StartedAt": "2025-12-02T18:58:53.587847975Z",
	            "FinishedAt": "0001-01-01T00:00:00Z"
	        },
	        "Image": "sha256:ac919894123858c63a6b115b7a0677e38aafc32ba4f00c3ebbd7c61e958451be",
	        "ResolvConfPath": "/var/lib/docker/containers/6870f21b62bb6903aca3129f1ce4723cf3f2ffad99a50b164f8e2dc04b50e75d/resolv.conf",
	        "HostnamePath": "/var/lib/docker/containers/6870f21b62bb6903aca3129f1ce4723cf3f2ffad99a50b164f8e2dc04b50e75d/hostname",
	        "HostsPath": "/var/lib/docker/containers/6870f21b62bb6903aca3129f1ce4723cf3f2ffad99a50b164f8e2dc04b50e75d/hosts",
	        "LogPath": "/var/lib/docker/containers/6870f21b62bb6903aca3129f1ce4723cf3f2ffad99a50b164f8e2dc04b50e75d/6870f21b62bb6903aca3129f1ce4723cf3f2ffad99a50b164f8e2dc04b50e75d-json.log",
	        "Name": "/functional-449836",
	        "RestartCount": 0,
	        "Driver": "overlay2",
	        "Platform": "linux",
	        "MountLabel": "",
	        "ProcessLabel": "",
	        "AppArmorProfile": "unconfined",
	        "ExecIDs": null,
	        "HostConfig": {
	            "Binds": [
	                "/lib/modules:/lib/modules:ro",
	                "functional-449836:/var"
	            ],
	            "ContainerIDFile": "",
	            "LogConfig": {
	                "Type": "json-file",
	                "Config": {}
	            },
	            "NetworkMode": "functional-449836",
	            "PortBindings": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "8441/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ]
	            },
	            "RestartPolicy": {
	                "Name": "no",
	                "MaximumRetryCount": 0
	            },
	            "AutoRemove": false,
	            "VolumeDriver": "",
	            "VolumesFrom": null,
	            "ConsoleSize": [
	                0,
	                0
	            ],
	            "CapAdd": null,
	            "CapDrop": null,
	            "CgroupnsMode": "host",
	            "Dns": [],
	            "DnsOptions": [],
	            "DnsSearch": [],
	            "ExtraHosts": null,
	            "GroupAdd": null,
	            "IpcMode": "private",
	            "Cgroup": "",
	            "Links": null,
	            "OomScoreAdj": 0,
	            "PidMode": "",
	            "Privileged": true,
	            "PublishAllPorts": false,
	            "ReadonlyRootfs": false,
	            "SecurityOpt": [
	                "seccomp=unconfined",
	                "apparmor=unconfined",
	                "label=disable"
	            ],
	            "Tmpfs": {
	                "/run": "",
	                "/tmp": ""
	            },
	            "UTSMode": "",
	            "UsernsMode": "",
	            "ShmSize": 67108864,
	            "Runtime": "runc",
	            "Isolation": "",
	            "CpuShares": 0,
	            "Memory": 4294967296,
	            "NanoCpus": 2000000000,
	            "CgroupParent": "",
	            "BlkioWeight": 0,
	            "BlkioWeightDevice": [],
	            "BlkioDeviceReadBps": [],
	            "BlkioDeviceWriteBps": [],
	            "BlkioDeviceReadIOps": [],
	            "BlkioDeviceWriteIOps": [],
	            "CpuPeriod": 0,
	            "CpuQuota": 0,
	            "CpuRealtimePeriod": 0,
	            "CpuRealtimeRuntime": 0,
	            "CpusetCpus": "",
	            "CpusetMems": "",
	            "Devices": [],
	            "DeviceCgroupRules": null,
	            "DeviceRequests": null,
	            "MemoryReservation": 0,
	            "MemorySwap": 8589934592,
	            "MemorySwappiness": null,
	            "OomKillDisable": false,
	            "PidsLimit": null,
	            "Ulimits": [],
	            "CpuCount": 0,
	            "CpuPercent": 0,
	            "IOMaximumIOps": 0,
	            "IOMaximumBandwidth": 0,
	            "MaskedPaths": null,
	            "ReadonlyPaths": null
	        },
	        "GraphDriver": {
	            "Data": {
	                "ID": "6870f21b62bb6903aca3129f1ce4723cf3f2ffad99a50b164f8e2dc04b50e75d",
	                "LowerDir": "/var/lib/docker/overlay2/41ea5a18fbf8709879a7fc4066a6a4a1474aa86e898ee1ccabe5669a1871131d-init/diff:/var/lib/docker/overlay2/a59c61675ee48e07a7f4a8725bd393449453344ad8907963779ea1c0059d936c/diff",
	                "MergedDir": "/var/lib/docker/overlay2/41ea5a18fbf8709879a7fc4066a6a4a1474aa86e898ee1ccabe5669a1871131d/merged",
	                "UpperDir": "/var/lib/docker/overlay2/41ea5a18fbf8709879a7fc4066a6a4a1474aa86e898ee1ccabe5669a1871131d/diff",
	                "WorkDir": "/var/lib/docker/overlay2/41ea5a18fbf8709879a7fc4066a6a4a1474aa86e898ee1ccabe5669a1871131d/work"
	            },
	            "Name": "overlay2"
	        },
	        "Mounts": [
	            {
	                "Type": "bind",
	                "Source": "/lib/modules",
	                "Destination": "/lib/modules",
	                "Mode": "ro",
	                "RW": false,
	                "Propagation": "rprivate"
	            },
	            {
	                "Type": "volume",
	                "Name": "functional-449836",
	                "Source": "/var/lib/docker/volumes/functional-449836/_data",
	                "Destination": "/var",
	                "Driver": "local",
	                "Mode": "z",
	                "RW": true,
	                "Propagation": ""
	            }
	        ],
	        "Config": {
	            "Hostname": "functional-449836",
	            "Domainname": "",
	            "User": "",
	            "AttachStdin": false,
	            "AttachStdout": false,
	            "AttachStderr": false,
	            "ExposedPorts": {
	                "22/tcp": {},
	                "2376/tcp": {},
	                "32443/tcp": {},
	                "5000/tcp": {},
	                "8441/tcp": {}
	            },
	            "Tty": true,
	            "OpenStdin": false,
	            "StdinOnce": false,
	            "Env": [
	                "container=docker",
	                "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin"
	            ],
	            "Cmd": null,
	            "Image": "gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b",
	            "Volumes": null,
	            "WorkingDir": "/",
	            "Entrypoint": [
	                "/usr/local/bin/entrypoint",
	                "/sbin/init"
	            ],
	            "OnBuild": null,
	            "Labels": {
	                "created_by.minikube.sigs.k8s.io": "true",
	                "mode.minikube.sigs.k8s.io": "functional-449836",
	                "name.minikube.sigs.k8s.io": "functional-449836",
	                "role.minikube.sigs.k8s.io": ""
	            },
	            "StopSignal": "SIGRTMIN+3"
	        },
	        "NetworkSettings": {
	            "Bridge": "",
	            "SandboxID": "410fbaab809a56b556195f7ed6eeff8dcd31e9020fb1dbfacf74828b79df3d88",
	            "SandboxKey": "/var/run/docker/netns/410fbaab809a",
	            "Ports": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "32788"
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "32789"
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "32792"
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "32790"
	                    }
	                ],
	                "8441/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "32791"
	                    }
	                ]
	            },
	            "HairpinMode": false,
	            "LinkLocalIPv6Address": "",
	            "LinkLocalIPv6PrefixLen": 0,
	            "SecondaryIPAddresses": null,
	            "SecondaryIPv6Addresses": null,
	            "EndpointID": "",
	            "Gateway": "",
	            "GlobalIPv6Address": "",
	            "GlobalIPv6PrefixLen": 0,
	            "IPAddress": "",
	            "IPPrefixLen": 0,
	            "IPv6Gateway": "",
	            "MacAddress": "",
	            "Networks": {
	                "functional-449836": {
	                    "IPAMConfig": {
	                        "IPv4Address": "192.168.49.2"
	                    },
	                    "Links": null,
	                    "Aliases": null,
	                    "MacAddress": "82:18:11:ce:46:48",
	                    "DriverOpts": null,
	                    "GwPriority": 0,
	                    "NetworkID": "3cb7d67d4fa267ddd6b37211325b224fb3fb811be8ff57bda18e19f6929ec9c8",
	                    "EndpointID": "20c8c1a67e53d7615656777f73986a40cb1c6affb22c4db185c479ac85cbdb14",
	                    "Gateway": "192.168.49.1",
	                    "IPAddress": "192.168.49.2",
	                    "IPPrefixLen": 24,
	                    "IPv6Gateway": "",
	                    "GlobalIPv6Address": "",
	                    "GlobalIPv6PrefixLen": 0,
	                    "DNSNames": [
	                        "functional-449836",
	                        "6870f21b62bb"
	                    ]
	                }
	            }
	        }
	    }
	]

-- /stdout --
helpers_test.go:247: (dbg) Run:  out/minikube-linux-arm64 status --format={{.Host}} -p functional-449836 -n functional-449836
helpers_test.go:247: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.Host}} -p functional-449836 -n functional-449836: exit status 2 (328.244616ms)

-- stdout --
	Running

-- /stdout --
helpers_test.go:247: status error: exit status 2 (may be ok)
helpers_test.go:252: <<< TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/DashboardCmd FAILED: start of post-mortem logs <<<
helpers_test.go:253: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/DashboardCmd]: minikube logs <======
helpers_test.go:255: (dbg) Run:  out/minikube-linux-arm64 -p functional-449836 logs -n 25
helpers_test.go:260: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/DashboardCmd logs: 
-- stdout --
	
	==> Audit <==
	┌───────────┬─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┬───────────────────┬─────────┬─────────┬─────────────────────┬─────────────────────┐
	│  COMMAND  │                                                                        ARGS                                                                         │      PROFILE      │  USER   │ VERSION │     START TIME      │      END TIME       │
	├───────────┼─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┼───────────────────┼─────────┼─────────┼─────────────────────┼─────────────────────┤
	│ service   │ functional-449836 service hello-node --url                                                                                                          │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:27 UTC │                     │
	│ mount     │ -p functional-449836 /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-beta.0parallelMo2287448557/001:/mount-9p --alsologtostderr -v=1              │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:27 UTC │                     │
	│ ssh       │ functional-449836 ssh findmnt -T /mount-9p | grep 9p                                                                                                │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:27 UTC │                     │
	│ ssh       │ functional-449836 ssh findmnt -T /mount-9p | grep 9p                                                                                                │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:27 UTC │ 02 Dec 25 19:27 UTC │
	│ ssh       │ functional-449836 ssh -- ls -la /mount-9p                                                                                                           │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:27 UTC │ 02 Dec 25 19:27 UTC │
	│ ssh       │ functional-449836 ssh cat /mount-9p/test-1764703664487000285                                                                                        │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:27 UTC │ 02 Dec 25 19:27 UTC │
	│ ssh       │ functional-449836 ssh mount | grep 9p; ls -la /mount-9p; cat /mount-9p/pod-dates                                                                    │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:27 UTC │                     │
	│ ssh       │ functional-449836 ssh sudo umount -f /mount-9p                                                                                                      │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:27 UTC │ 02 Dec 25 19:27 UTC │
	│ ssh       │ functional-449836 ssh findmnt -T /mount-9p | grep 9p                                                                                                │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:27 UTC │                     │
	│ mount     │ -p functional-449836 /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-beta.0parallelMo265742512/001:/mount-9p --alsologtostderr -v=1 --port 46464  │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:27 UTC │                     │
	│ ssh       │ functional-449836 ssh findmnt -T /mount-9p | grep 9p                                                                                                │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:27 UTC │ 02 Dec 25 19:27 UTC │
	│ ssh       │ functional-449836 ssh -- ls -la /mount-9p                                                                                                           │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:27 UTC │ 02 Dec 25 19:27 UTC │
	│ ssh       │ functional-449836 ssh sudo umount -f /mount-9p                                                                                                      │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:27 UTC │                     │
	│ ssh       │ functional-449836 ssh findmnt -T /mount1                                                                                                            │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:27 UTC │                     │
	│ mount     │ -p functional-449836 /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-beta.0parallelMo681912831/001:/mount1 --alsologtostderr -v=1                 │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:27 UTC │                     │
	│ mount     │ -p functional-449836 /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-beta.0parallelMo681912831/001:/mount3 --alsologtostderr -v=1                 │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:27 UTC │                     │
	│ mount     │ -p functional-449836 /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-beta.0parallelMo681912831/001:/mount2 --alsologtostderr -v=1                 │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:27 UTC │                     │
	│ ssh       │ functional-449836 ssh findmnt -T /mount1                                                                                                            │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:27 UTC │ 02 Dec 25 19:27 UTC │
	│ ssh       │ functional-449836 ssh findmnt -T /mount2                                                                                                            │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:27 UTC │ 02 Dec 25 19:27 UTC │
	│ ssh       │ functional-449836 ssh findmnt -T /mount3                                                                                                            │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:27 UTC │ 02 Dec 25 19:27 UTC │
	│ mount     │ -p functional-449836 --kill=true                                                                                                                    │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:27 UTC │                     │
	│ start     │ -p functional-449836 --dry-run --memory 250MB --alsologtostderr --driver=docker  --container-runtime=containerd --kubernetes-version=v1.35.0-beta.0 │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:27 UTC │                     │
	│ start     │ -p functional-449836 --dry-run --memory 250MB --alsologtostderr --driver=docker  --container-runtime=containerd --kubernetes-version=v1.35.0-beta.0 │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:27 UTC │                     │
	│ start     │ -p functional-449836 --dry-run --alsologtostderr -v=1 --driver=docker  --container-runtime=containerd --kubernetes-version=v1.35.0-beta.0           │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:27 UTC │                     │
	│ dashboard │ --url --port 36195 -p functional-449836 --alsologtostderr -v=1                                                                                      │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:27 UTC │                     │
	└───────────┴─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┴───────────────────┴─────────┴─────────┴─────────────────────┴─────────────────────┘
	
	
	==> Last Start <==
	Log file created at: 2025/12/02 19:27:54
	Running on machine: ip-172-31-31-251
	Binary: Built with gc go1.25.3 for linux/arm64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I1202 19:27:54.334705   72077 out.go:360] Setting OutFile to fd 1 ...
	I1202 19:27:54.334826   72077 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1202 19:27:54.334832   72077 out.go:374] Setting ErrFile to fd 2...
	I1202 19:27:54.334838   72077 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1202 19:27:54.335121   72077 root.go:338] Updating PATH: /home/jenkins/minikube-integration/22021-2487/.minikube/bin
	I1202 19:27:54.335475   72077 out.go:368] Setting JSON to false
	I1202 19:27:54.336281   72077 start.go:133] hostinfo: {"hostname":"ip-172-31-31-251","uptime":4211,"bootTime":1764699464,"procs":157,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1084-aws","kernelArch":"aarch64","virtualizationSystem":"","virtualizationRole":"","hostId":"982e3628-3742-4b3e-bb63-ac1b07660ec7"}
	I1202 19:27:54.336407   72077 start.go:143] virtualization:  
	I1202 19:27:54.339578   72077 out.go:179] * [functional-449836] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	I1202 19:27:54.342634   72077 out.go:179]   - MINIKUBE_LOCATION=22021
	I1202 19:27:54.342710   72077 notify.go:221] Checking for updates...
	I1202 19:27:54.349520   72077 out.go:179]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I1202 19:27:54.352529   72077 out.go:179]   - KUBECONFIG=/home/jenkins/minikube-integration/22021-2487/kubeconfig
	I1202 19:27:54.355602   72077 out.go:179]   - MINIKUBE_HOME=/home/jenkins/minikube-integration/22021-2487/.minikube
	I1202 19:27:54.358596   72077 out.go:179]   - MINIKUBE_BIN=out/minikube-linux-arm64
	I1202 19:27:54.361481   72077 out.go:179]   - MINIKUBE_FORCE_SYSTEMD=
	I1202 19:27:54.364941   72077 config.go:182] Loaded profile config "functional-449836": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-beta.0
	I1202 19:27:54.365634   72077 driver.go:422] Setting default libvirt URI to qemu:///system
	I1202 19:27:54.401377   72077 docker.go:124] docker version: linux-28.1.1:Docker Engine - Community
	I1202 19:27:54.401513   72077 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1202 19:27:54.491357   72077 info.go:266] docker info: {ID:EOU5:DNGX:XN6V:L2FZ:UXRM:5TWK:EVUR:KC2F:GT7Z:Y4O4:GB77:5PD3 Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:52 SystemTime:2025-12-02 19:27:54.481540534 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214839296 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-31-251 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1202 19:27:54.491466   72077 docker.go:319] overlay module found
	I1202 19:27:54.494502   72077 out.go:179] * Using the docker driver based on existing profile
	I1202 19:27:54.497312   72077 start.go:309] selected driver: docker
	I1202 19:27:54.497331   72077 start.go:927] validating driver "docker" against &{Name:functional-449836 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-449836 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1202 19:27:54.497422   72077 start.go:938] status for docker: {Installed:true Healthy:true Running:false NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I1202 19:27:54.497523   72077 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1202 19:27:54.562952   72077 info.go:266] docker info: {ID:EOU5:DNGX:XN6V:L2FZ:UXRM:5TWK:EVUR:KC2F:GT7Z:Y4O4:GB77:5PD3 Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:52 SystemTime:2025-12-02 19:27:54.553803986 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214839296 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-31-251 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1202 19:27:54.563435   72077 cni.go:84] Creating CNI manager for ""
	I1202 19:27:54.563498   72077 cni.go:143] "docker" driver + "containerd" runtime found, recommending kindnet
	I1202 19:27:54.563539   72077 start.go:353] cluster config:
	{Name:functional-449836 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-449836 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1202 19:27:54.568468   72077 out.go:179] * dry-run validation complete!
	
	
	==> container status <==
	CONTAINER           IMAGE               CREATED             STATE               NAME                ATTEMPT             POD ID              POD                 NAMESPACE
	
	
	==> containerd <==
	Dec 02 19:13:45 functional-449836 containerd[10263]: time="2025-12-02T19:13:45.578599853Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1
	Dec 02 19:13:45 functional-449836 containerd[10263]: time="2025-12-02T19:13:45.578662622Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1
	Dec 02 19:13:45 functional-449836 containerd[10263]: time="2025-12-02T19:13:45.578725071Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1
	Dec 02 19:13:45 functional-449836 containerd[10263]: time="2025-12-02T19:13:45.578783803Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1
	Dec 02 19:13:45 functional-449836 containerd[10263]: time="2025-12-02T19:13:45.578855024Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1
	Dec 02 19:13:45 functional-449836 containerd[10263]: time="2025-12-02T19:13:45.578924119Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1
	Dec 02 19:13:45 functional-449836 containerd[10263]: time="2025-12-02T19:13:45.578991820Z" level=info msg="runtime interface created"
	Dec 02 19:13:45 functional-449836 containerd[10263]: time="2025-12-02T19:13:45.579043832Z" level=info msg="created NRI interface"
	Dec 02 19:13:45 functional-449836 containerd[10263]: time="2025-12-02T19:13:45.579105847Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1
	Dec 02 19:13:45 functional-449836 containerd[10263]: time="2025-12-02T19:13:45.579207451Z" level=info msg="Connect containerd service"
	Dec 02 19:13:45 functional-449836 containerd[10263]: time="2025-12-02T19:13:45.579595759Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this"
	Dec 02 19:13:45 functional-449836 containerd[10263]: time="2025-12-02T19:13:45.580416453Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config"
	Dec 02 19:13:45 functional-449836 containerd[10263]: time="2025-12-02T19:13:45.590441353Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc
	Dec 02 19:13:45 functional-449836 containerd[10263]: time="2025-12-02T19:13:45.590507150Z" level=info msg=serving... address=/run/containerd/containerd.sock
	Dec 02 19:13:45 functional-449836 containerd[10263]: time="2025-12-02T19:13:45.590537673Z" level=info msg="Start subscribing containerd event"
	Dec 02 19:13:45 functional-449836 containerd[10263]: time="2025-12-02T19:13:45.590591277Z" level=info msg="Start recovering state"
	Dec 02 19:13:45 functional-449836 containerd[10263]: time="2025-12-02T19:13:45.614386130Z" level=info msg="Start event monitor"
	Dec 02 19:13:45 functional-449836 containerd[10263]: time="2025-12-02T19:13:45.614577326Z" level=info msg="Start cni network conf syncer for default"
	Dec 02 19:13:45 functional-449836 containerd[10263]: time="2025-12-02T19:13:45.614677601Z" level=info msg="Start streaming server"
	Dec 02 19:13:45 functional-449836 containerd[10263]: time="2025-12-02T19:13:45.614762451Z" level=info msg="Registered namespace \"k8s.io\" with NRI"
	Dec 02 19:13:45 functional-449836 containerd[10263]: time="2025-12-02T19:13:45.614968071Z" level=info msg="runtime interface starting up..."
	Dec 02 19:13:45 functional-449836 containerd[10263]: time="2025-12-02T19:13:45.615037774Z" level=info msg="starting plugins..."
	Dec 02 19:13:45 functional-449836 containerd[10263]: time="2025-12-02T19:13:45.615100272Z" level=info msg="Synchronizing NRI (plugin) with current runtime state"
	Dec 02 19:13:45 functional-449836 containerd[10263]: time="2025-12-02T19:13:45.615329048Z" level=info msg="containerd successfully booted in 0.058232s"
	Dec 02 19:13:45 functional-449836 systemd[1]: Started containerd.service - containerd container runtime.
	
	
	==> describe nodes <==
	command /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" failed with error: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:27:55.874399   23802 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:27:55.875163   23802 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:27:55.876788   23802 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:27:55.877369   23802 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:27:55.878963   23802 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	
	==> dmesg <==
	[Dec 2 18:17] ACPI: SRAT not present
	[  +0.000000] ACPI: SRAT not present
	[  +0.000000] SPI driver altr_a10sr has no spi_device_id for altr,a10sr
	[  +0.015127] device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
	[  +0.494583] systemd[1]: Configuration file /run/systemd/system/netplan-ovs-cleanup.service is marked world-inaccessible. This has no effect as configuration data is accessible via APIs without restrictions. Proceeding anyway.
	[  +0.035754] systemd[1]: /lib/systemd/system/snapd.service:23: Unknown key name 'RestartMode' in section 'Service', ignoring.
	[  +0.870945] ena 0000:00:05.0: LLQ is not supported Fallback to host mode policy.
	[  +6.299680] kauditd_printk_skb: 36 callbacks suppressed
	
	
	==> kernel <==
	 19:27:55 up  1:10,  0 user,  load average: 0.67, 0.30, 0.36
	Linux functional-449836 5.15.0-1084-aws #91~20.04.1-Ubuntu SMP Fri May 2 07:00:04 UTC 2025 aarch64 GNU/Linux
	PRETTY_NAME="Debian GNU/Linux 12 (bookworm)"
	
	
	==> kubelet <==
	Dec 02 19:27:52 functional-449836 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 02 19:27:52 functional-449836 kubelet[23644]: E1202 19:27:52.983586   23644 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 02 19:27:52 functional-449836 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 02 19:27:52 functional-449836 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 02 19:27:53 functional-449836 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 478.
	Dec 02 19:27:53 functional-449836 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 02 19:27:53 functional-449836 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 02 19:27:53 functional-449836 kubelet[23665]: E1202 19:27:53.741720   23665 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 02 19:27:53 functional-449836 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 02 19:27:53 functional-449836 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 02 19:27:54 functional-449836 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 479.
	Dec 02 19:27:54 functional-449836 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 02 19:27:54 functional-449836 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 02 19:27:54 functional-449836 kubelet[23685]: E1202 19:27:54.504386   23685 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 02 19:27:54 functional-449836 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 02 19:27:54 functional-449836 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 02 19:27:55 functional-449836 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 480.
	Dec 02 19:27:55 functional-449836 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 02 19:27:55 functional-449836 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 02 19:27:55 functional-449836 kubelet[23706]: E1202 19:27:55.243912   23706 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 02 19:27:55 functional-449836 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 02 19:27:55 functional-449836 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 02 19:27:55 functional-449836 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 481.
	Dec 02 19:27:55 functional-449836 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 02 19:27:55 functional-449836 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	

-- /stdout --
helpers_test.go:262: (dbg) Run:  out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-449836 -n functional-449836
helpers_test.go:262: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-449836 -n functional-449836: exit status 2 (324.42241ms)

-- stdout --
	Stopped

-- /stdout --
helpers_test.go:262: status error: exit status 2 (may be ok)
helpers_test.go:264: "functional-449836" apiserver is not running, skipping kubectl commands (state="Stopped")
--- FAIL: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/DashboardCmd (1.71s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/StatusCmd (3.1s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/StatusCmd
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/StatusCmd

=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/StatusCmd
functional_test.go:869: (dbg) Run:  out/minikube-linux-arm64 -p functional-449836 status
functional_test.go:869: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-449836 status: exit status 2 (315.208321ms)

-- stdout --
	functional-449836
	type: Control Plane
	host: Running
	kubelet: Stopped
	apiserver: Stopped
	kubeconfig: Configured
	

-- /stdout --
functional_test.go:871: failed to run minikube status. args "out/minikube-linux-arm64 -p functional-449836 status" : exit status 2
functional_test.go:875: (dbg) Run:  out/minikube-linux-arm64 -p functional-449836 status -f host:{{.Host}},kublet:{{.Kubelet}},apiserver:{{.APIServer}},kubeconfig:{{.Kubeconfig}}
functional_test.go:875: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-449836 status -f host:{{.Host}},kublet:{{.Kubelet}},apiserver:{{.APIServer}},kubeconfig:{{.Kubeconfig}}: exit status 2 (347.272601ms)

-- stdout --
	host:Running,kublet:Running,apiserver:Stopped,kubeconfig:Configured

-- /stdout --
functional_test.go:877: failed to run minikube status with custom format: args "out/minikube-linux-arm64 -p functional-449836 status -f host:{{.Host}},kublet:{{.Kubelet}},apiserver:{{.APIServer}},kubeconfig:{{.Kubeconfig}}": exit status 2
functional_test.go:887: (dbg) Run:  out/minikube-linux-arm64 -p functional-449836 status -o json
functional_test.go:887: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-449836 status -o json: exit status 2 (303.991769ms)

-- stdout --
	{"Name":"functional-449836","Host":"Running","Kubelet":"Stopped","APIServer":"Stopped","Kubeconfig":"Configured","Worker":false}

-- /stdout --
functional_test.go:889: failed to run minikube status with json output. args "out/minikube-linux-arm64 -p functional-449836 status -o json" : exit status 2
helpers_test.go:222: -----------------------post-mortem--------------------------------
helpers_test.go:223: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/StatusCmd]: network settings <======
helpers_test.go:230: HOST ENV snapshots: PROXY env: HTTP_PROXY="<empty>" HTTPS_PROXY="<empty>" NO_PROXY="<empty>"
helpers_test.go:238: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/StatusCmd]: docker inspect <======
helpers_test.go:239: (dbg) Run:  docker inspect functional-449836
helpers_test.go:243: (dbg) docker inspect functional-449836:

-- stdout --
	[
	    {
	        "Id": "6870f21b62bb6903aca3129f1ce4723cf3f2ffad99a50b164f8e2dc04b50e75d",
	        "Created": "2025-12-02T18:58:53.515075222Z",
	        "Path": "/usr/local/bin/entrypoint",
	        "Args": [
	            "/sbin/init"
	        ],
	        "State": {
	            "Status": "running",
	            "Running": true,
	            "Paused": false,
	            "Restarting": false,
	            "OOMKilled": false,
	            "Dead": false,
	            "Pid": 43093,
	            "ExitCode": 0,
	            "Error": "",
	            "StartedAt": "2025-12-02T18:58:53.587847975Z",
	            "FinishedAt": "0001-01-01T00:00:00Z"
	        },
	        "Image": "sha256:ac919894123858c63a6b115b7a0677e38aafc32ba4f00c3ebbd7c61e958451be",
	        "ResolvConfPath": "/var/lib/docker/containers/6870f21b62bb6903aca3129f1ce4723cf3f2ffad99a50b164f8e2dc04b50e75d/resolv.conf",
	        "HostnamePath": "/var/lib/docker/containers/6870f21b62bb6903aca3129f1ce4723cf3f2ffad99a50b164f8e2dc04b50e75d/hostname",
	        "HostsPath": "/var/lib/docker/containers/6870f21b62bb6903aca3129f1ce4723cf3f2ffad99a50b164f8e2dc04b50e75d/hosts",
	        "LogPath": "/var/lib/docker/containers/6870f21b62bb6903aca3129f1ce4723cf3f2ffad99a50b164f8e2dc04b50e75d/6870f21b62bb6903aca3129f1ce4723cf3f2ffad99a50b164f8e2dc04b50e75d-json.log",
	        "Name": "/functional-449836",
	        "RestartCount": 0,
	        "Driver": "overlay2",
	        "Platform": "linux",
	        "MountLabel": "",
	        "ProcessLabel": "",
	        "AppArmorProfile": "unconfined",
	        "ExecIDs": null,
	        "HostConfig": {
	            "Binds": [
	                "/lib/modules:/lib/modules:ro",
	                "functional-449836:/var"
	            ],
	            "ContainerIDFile": "",
	            "LogConfig": {
	                "Type": "json-file",
	                "Config": {}
	            },
	            "NetworkMode": "functional-449836",
	            "PortBindings": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "8441/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ]
	            },
	            "RestartPolicy": {
	                "Name": "no",
	                "MaximumRetryCount": 0
	            },
	            "AutoRemove": false,
	            "VolumeDriver": "",
	            "VolumesFrom": null,
	            "ConsoleSize": [
	                0,
	                0
	            ],
	            "CapAdd": null,
	            "CapDrop": null,
	            "CgroupnsMode": "host",
	            "Dns": [],
	            "DnsOptions": [],
	            "DnsSearch": [],
	            "ExtraHosts": null,
	            "GroupAdd": null,
	            "IpcMode": "private",
	            "Cgroup": "",
	            "Links": null,
	            "OomScoreAdj": 0,
	            "PidMode": "",
	            "Privileged": true,
	            "PublishAllPorts": false,
	            "ReadonlyRootfs": false,
	            "SecurityOpt": [
	                "seccomp=unconfined",
	                "apparmor=unconfined",
	                "label=disable"
	            ],
	            "Tmpfs": {
	                "/run": "",
	                "/tmp": ""
	            },
	            "UTSMode": "",
	            "UsernsMode": "",
	            "ShmSize": 67108864,
	            "Runtime": "runc",
	            "Isolation": "",
	            "CpuShares": 0,
	            "Memory": 4294967296,
	            "NanoCpus": 2000000000,
	            "CgroupParent": "",
	            "BlkioWeight": 0,
	            "BlkioWeightDevice": [],
	            "BlkioDeviceReadBps": [],
	            "BlkioDeviceWriteBps": [],
	            "BlkioDeviceReadIOps": [],
	            "BlkioDeviceWriteIOps": [],
	            "CpuPeriod": 0,
	            "CpuQuota": 0,
	            "CpuRealtimePeriod": 0,
	            "CpuRealtimeRuntime": 0,
	            "CpusetCpus": "",
	            "CpusetMems": "",
	            "Devices": [],
	            "DeviceCgroupRules": null,
	            "DeviceRequests": null,
	            "MemoryReservation": 0,
	            "MemorySwap": 8589934592,
	            "MemorySwappiness": null,
	            "OomKillDisable": false,
	            "PidsLimit": null,
	            "Ulimits": [],
	            "CpuCount": 0,
	            "CpuPercent": 0,
	            "IOMaximumIOps": 0,
	            "IOMaximumBandwidth": 0,
	            "MaskedPaths": null,
	            "ReadonlyPaths": null
	        },
	        "GraphDriver": {
	            "Data": {
	                "ID": "6870f21b62bb6903aca3129f1ce4723cf3f2ffad99a50b164f8e2dc04b50e75d",
	                "LowerDir": "/var/lib/docker/overlay2/41ea5a18fbf8709879a7fc4066a6a4a1474aa86e898ee1ccabe5669a1871131d-init/diff:/var/lib/docker/overlay2/a59c61675ee48e07a7f4a8725bd393449453344ad8907963779ea1c0059d936c/diff",
	                "MergedDir": "/var/lib/docker/overlay2/41ea5a18fbf8709879a7fc4066a6a4a1474aa86e898ee1ccabe5669a1871131d/merged",
	                "UpperDir": "/var/lib/docker/overlay2/41ea5a18fbf8709879a7fc4066a6a4a1474aa86e898ee1ccabe5669a1871131d/diff",
	                "WorkDir": "/var/lib/docker/overlay2/41ea5a18fbf8709879a7fc4066a6a4a1474aa86e898ee1ccabe5669a1871131d/work"
	            },
	            "Name": "overlay2"
	        },
	        "Mounts": [
	            {
	                "Type": "bind",
	                "Source": "/lib/modules",
	                "Destination": "/lib/modules",
	                "Mode": "ro",
	                "RW": false,
	                "Propagation": "rprivate"
	            },
	            {
	                "Type": "volume",
	                "Name": "functional-449836",
	                "Source": "/var/lib/docker/volumes/functional-449836/_data",
	                "Destination": "/var",
	                "Driver": "local",
	                "Mode": "z",
	                "RW": true,
	                "Propagation": ""
	            }
	        ],
	        "Config": {
	            "Hostname": "functional-449836",
	            "Domainname": "",
	            "User": "",
	            "AttachStdin": false,
	            "AttachStdout": false,
	            "AttachStderr": false,
	            "ExposedPorts": {
	                "22/tcp": {},
	                "2376/tcp": {},
	                "32443/tcp": {},
	                "5000/tcp": {},
	                "8441/tcp": {}
	            },
	            "Tty": true,
	            "OpenStdin": false,
	            "StdinOnce": false,
	            "Env": [
	                "container=docker",
	                "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin"
	            ],
	            "Cmd": null,
	            "Image": "gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b",
	            "Volumes": null,
	            "WorkingDir": "/",
	            "Entrypoint": [
	                "/usr/local/bin/entrypoint",
	                "/sbin/init"
	            ],
	            "OnBuild": null,
	            "Labels": {
	                "created_by.minikube.sigs.k8s.io": "true",
	                "mode.minikube.sigs.k8s.io": "functional-449836",
	                "name.minikube.sigs.k8s.io": "functional-449836",
	                "role.minikube.sigs.k8s.io": ""
	            },
	            "StopSignal": "SIGRTMIN+3"
	        },
	        "NetworkSettings": {
	            "Bridge": "",
	            "SandboxID": "410fbaab809a56b556195f7ed6eeff8dcd31e9020fb1dbfacf74828b79df3d88",
	            "SandboxKey": "/var/run/docker/netns/410fbaab809a",
	            "Ports": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "32788"
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "32789"
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "32792"
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "32790"
	                    }
	                ],
	                "8441/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "32791"
	                    }
	                ]
	            },
	            "HairpinMode": false,
	            "LinkLocalIPv6Address": "",
	            "LinkLocalIPv6PrefixLen": 0,
	            "SecondaryIPAddresses": null,
	            "SecondaryIPv6Addresses": null,
	            "EndpointID": "",
	            "Gateway": "",
	            "GlobalIPv6Address": "",
	            "GlobalIPv6PrefixLen": 0,
	            "IPAddress": "",
	            "IPPrefixLen": 0,
	            "IPv6Gateway": "",
	            "MacAddress": "",
	            "Networks": {
	                "functional-449836": {
	                    "IPAMConfig": {
	                        "IPv4Address": "192.168.49.2"
	                    },
	                    "Links": null,
	                    "Aliases": null,
	                    "MacAddress": "82:18:11:ce:46:48",
	                    "DriverOpts": null,
	                    "GwPriority": 0,
	                    "NetworkID": "3cb7d67d4fa267ddd6b37211325b224fb3fb811be8ff57bda18e19f6929ec9c8",
	                    "EndpointID": "20c8c1a67e53d7615656777f73986a40cb1c6affb22c4db185c479ac85cbdb14",
	                    "Gateway": "192.168.49.1",
	                    "IPAddress": "192.168.49.2",
	                    "IPPrefixLen": 24,
	                    "IPv6Gateway": "",
	                    "GlobalIPv6Address": "",
	                    "GlobalIPv6PrefixLen": 0,
	                    "DNSNames": [
	                        "functional-449836",
	                        "6870f21b62bb"
	                    ]
	                }
	            }
	        }
	    }
	]

-- /stdout --
helpers_test.go:247: (dbg) Run:  out/minikube-linux-arm64 status --format={{.Host}} -p functional-449836 -n functional-449836
helpers_test.go:247: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.Host}} -p functional-449836 -n functional-449836: exit status 2 (300.479981ms)

-- stdout --
	Running

-- /stdout --
helpers_test.go:247: status error: exit status 2 (may be ok)
helpers_test.go:252: <<< TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/StatusCmd FAILED: start of post-mortem logs <<<
helpers_test.go:253: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/StatusCmd]: minikube logs <======
helpers_test.go:255: (dbg) Run:  out/minikube-linux-arm64 -p functional-449836 logs -n 25
helpers_test.go:255: (dbg) Done: out/minikube-linux-arm64 -p functional-449836 logs -n 25: (1.005829218s)
helpers_test.go:260: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/StatusCmd logs: 
-- stdout --
	
	==> Audit <==
	┌─────────┬────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┬───────────────────┬─────────┬─────────┬─────────────────────┬─────────────────────┐
	│ COMMAND │                                                                        ARGS                                                                        │      PROFILE      │  USER   │ VERSION │     START TIME      │      END TIME       │
	├─────────┼────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┼───────────────────┼─────────┼─────────┼─────────────────────┼─────────────────────┤
	│ service │ functional-449836 service list                                                                                                                     │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:27 UTC │                     │
	│ service │ functional-449836 service list -o json                                                                                                             │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:27 UTC │                     │
	│ service │ functional-449836 service --namespace=default --https --url hello-node                                                                             │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:27 UTC │                     │
	│ service │ functional-449836 service hello-node --url --format={{.IP}}                                                                                        │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:27 UTC │                     │
	│ service │ functional-449836 service hello-node --url                                                                                                         │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:27 UTC │                     │
	│ mount   │ -p functional-449836 /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-beta.0parallelMo2287448557/001:/mount-9p --alsologtostderr -v=1             │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:27 UTC │                     │
	│ ssh     │ functional-449836 ssh findmnt -T /mount-9p | grep 9p                                                                                               │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:27 UTC │                     │
	│ ssh     │ functional-449836 ssh findmnt -T /mount-9p | grep 9p                                                                                               │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:27 UTC │ 02 Dec 25 19:27 UTC │
	│ ssh     │ functional-449836 ssh -- ls -la /mount-9p                                                                                                          │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:27 UTC │ 02 Dec 25 19:27 UTC │
	│ ssh     │ functional-449836 ssh cat /mount-9p/test-1764703664487000285                                                                                       │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:27 UTC │ 02 Dec 25 19:27 UTC │
	│ ssh     │ functional-449836 ssh mount | grep 9p; ls -la /mount-9p; cat /mount-9p/pod-dates                                                                   │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:27 UTC │                     │
	│ ssh     │ functional-449836 ssh sudo umount -f /mount-9p                                                                                                     │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:27 UTC │ 02 Dec 25 19:27 UTC │
	│ ssh     │ functional-449836 ssh findmnt -T /mount-9p | grep 9p                                                                                               │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:27 UTC │                     │
	│ mount   │ -p functional-449836 /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-beta.0parallelMo265742512/001:/mount-9p --alsologtostderr -v=1 --port 46464 │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:27 UTC │                     │
	│ ssh     │ functional-449836 ssh findmnt -T /mount-9p | grep 9p                                                                                               │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:27 UTC │ 02 Dec 25 19:27 UTC │
	│ ssh     │ functional-449836 ssh -- ls -la /mount-9p                                                                                                          │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:27 UTC │ 02 Dec 25 19:27 UTC │
	│ ssh     │ functional-449836 ssh sudo umount -f /mount-9p                                                                                                     │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:27 UTC │                     │
	│ ssh     │ functional-449836 ssh findmnt -T /mount1                                                                                                           │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:27 UTC │                     │
	│ mount   │ -p functional-449836 /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-beta.0parallelMo681912831/001:/mount1 --alsologtostderr -v=1                │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:27 UTC │                     │
	│ mount   │ -p functional-449836 /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-beta.0parallelMo681912831/001:/mount3 --alsologtostderr -v=1                │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:27 UTC │                     │
	│ mount   │ -p functional-449836 /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-beta.0parallelMo681912831/001:/mount2 --alsologtostderr -v=1                │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:27 UTC │                     │
	│ ssh     │ functional-449836 ssh findmnt -T /mount1                                                                                                           │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:27 UTC │ 02 Dec 25 19:27 UTC │
	│ ssh     │ functional-449836 ssh findmnt -T /mount2                                                                                                           │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:27 UTC │ 02 Dec 25 19:27 UTC │
	│ ssh     │ functional-449836 ssh findmnt -T /mount3                                                                                                           │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:27 UTC │ 02 Dec 25 19:27 UTC │
	│ mount   │ -p functional-449836 --kill=true                                                                                                                   │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:27 UTC │                     │
	└─────────┴────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┴───────────────────┴─────────┴─────────┴─────────────────────┴─────────────────────┘
	
	
	==> Last Start <==
	Log file created at: 2025/12/02 19:13:42
	Running on machine: ip-172-31-31-251
	Binary: Built with gc go1.25.3 for linux/arm64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I1202 19:13:42.762704   54807 out.go:360] Setting OutFile to fd 1 ...
	I1202 19:13:42.762827   54807 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1202 19:13:42.762831   54807 out.go:374] Setting ErrFile to fd 2...
	I1202 19:13:42.762834   54807 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1202 19:13:42.763078   54807 root.go:338] Updating PATH: /home/jenkins/minikube-integration/22021-2487/.minikube/bin
	I1202 19:13:42.763410   54807 out.go:368] Setting JSON to false
	I1202 19:13:42.764228   54807 start.go:133] hostinfo: {"hostname":"ip-172-31-31-251","uptime":3359,"bootTime":1764699464,"procs":155,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1084-aws","kernelArch":"aarch64","virtualizationSystem":"","virtualizationRole":"","hostId":"982e3628-3742-4b3e-bb63-ac1b07660ec7"}
	I1202 19:13:42.764287   54807 start.go:143] virtualization:  
	I1202 19:13:42.767748   54807 out.go:179] * [functional-449836] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	I1202 19:13:42.771595   54807 out.go:179]   - MINIKUBE_LOCATION=22021
	I1202 19:13:42.771638   54807 notify.go:221] Checking for updates...
	I1202 19:13:42.777727   54807 out.go:179]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I1202 19:13:42.780738   54807 out.go:179]   - KUBECONFIG=/home/jenkins/minikube-integration/22021-2487/kubeconfig
	I1202 19:13:42.783655   54807 out.go:179]   - MINIKUBE_HOME=/home/jenkins/minikube-integration/22021-2487/.minikube
	I1202 19:13:42.786554   54807 out.go:179]   - MINIKUBE_BIN=out/minikube-linux-arm64
	I1202 19:13:42.789556   54807 out.go:179]   - MINIKUBE_FORCE_SYSTEMD=
	I1202 19:13:42.793178   54807 config.go:182] Loaded profile config "functional-449836": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-beta.0
	I1202 19:13:42.793273   54807 driver.go:422] Setting default libvirt URI to qemu:///system
	I1202 19:13:42.817932   54807 docker.go:124] docker version: linux-28.1.1:Docker Engine - Community
	I1202 19:13:42.818037   54807 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1202 19:13:42.893670   54807 info.go:266] docker info: {ID:EOU5:DNGX:XN6V:L2FZ:UXRM:5TWK:EVUR:KC2F:GT7Z:Y4O4:GB77:5PD3 Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:55 SystemTime:2025-12-02 19:13:42.884370868 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214839296 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-31-251 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1202 19:13:42.893764   54807 docker.go:319] overlay module found
	I1202 19:13:42.896766   54807 out.go:179] * Using the docker driver based on existing profile
	I1202 19:13:42.899559   54807 start.go:309] selected driver: docker
	I1202 19:13:42.899567   54807 start.go:927] validating driver "docker" against &{Name:functional-449836 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-449836 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1202 19:13:42.899671   54807 start.go:938] status for docker: {Installed:true Healthy:true Running:false NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I1202 19:13:42.899770   54807 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1202 19:13:42.952802   54807 info.go:266] docker info: {ID:EOU5:DNGX:XN6V:L2FZ:UXRM:5TWK:EVUR:KC2F:GT7Z:Y4O4:GB77:5PD3 Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:55 SystemTime:2025-12-02 19:13:42.943962699 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214839296 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-31-251 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1202 19:13:42.953225   54807 start_flags.go:992] Waiting for all components: map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true]
	I1202 19:13:42.953247   54807 cni.go:84] Creating CNI manager for ""
	I1202 19:13:42.953303   54807 cni.go:143] "docker" driver + "containerd" runtime found, recommending kindnet
	I1202 19:13:42.953342   54807 start.go:353] cluster config:
	{Name:functional-449836 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-449836 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1202 19:13:42.958183   54807 out.go:179] * Starting "functional-449836" primary control-plane node in "functional-449836" cluster
	I1202 19:13:42.960983   54807 cache.go:134] Beginning downloading kic base image for docker with containerd
	I1202 19:13:42.963884   54807 out.go:179] * Pulling base image v0.0.48-1764169655-21974 ...
	I1202 19:13:42.968058   54807 preload.go:188] Checking if preload exists for k8s version v1.35.0-beta.0 and runtime containerd
	I1202 19:13:42.968252   54807 image.go:81] Checking for gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b in local docker daemon
	I1202 19:13:42.989666   54807 image.go:100] Found gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b in local docker daemon, skipping pull
	I1202 19:13:42.989677   54807 cache.go:158] gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b exists in daemon, skipping load
	W1202 19:13:43.031045   54807 preload.go:144] https://storage.googleapis.com/minikube-preloaded-volume-tarballs/v18/v1.35.0-beta.0/preloaded-images-k8s-v18-v1.35.0-beta.0-containerd-overlay2-arm64.tar.lz4 status code: 404
	W1202 19:13:43.240107   54807 preload.go:144] https://github.com/kubernetes-sigs/minikube-preloads/releases/download/v18/preloaded-images-k8s-v18-v1.35.0-beta.0-containerd-overlay2-arm64.tar.lz4 status code: 404
	I1202 19:13:43.240267   54807 profile.go:143] Saving config to /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/config.json ...
	I1202 19:13:43.240445   54807 cache.go:107] acquiring lock: {Name:mk7e3720bc30e96a70479f1acc707ef52791d566 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1202 19:13:43.240540   54807 cache.go:115] /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/kube-controller-manager_v1.35.0-beta.0 exists
	I1202 19:13:43.240557   54807 cache.go:96] cache image "registry.k8s.io/kube-controller-manager:v1.35.0-beta.0" -> "/home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/kube-controller-manager_v1.35.0-beta.0" took 118.031µs
	I1202 19:13:43.240570   54807 cache.go:80] save to tar file registry.k8s.io/kube-controller-manager:v1.35.0-beta.0 -> /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/kube-controller-manager_v1.35.0-beta.0 succeeded
	I1202 19:13:43.240584   54807 cache.go:107] acquiring lock: {Name:mkfa39bba55c97fa80e441f8dcbaf6dc6a2ab6fc Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1202 19:13:43.240616   54807 cache.go:115] /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/kube-apiserver_v1.35.0-beta.0 exists
	I1202 19:13:43.240621   54807 cache.go:96] cache image "registry.k8s.io/kube-apiserver:v1.35.0-beta.0" -> "/home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/kube-apiserver_v1.35.0-beta.0" took 38.835µs
	I1202 19:13:43.240626   54807 cache.go:80] save to tar file registry.k8s.io/kube-apiserver:v1.35.0-beta.0 -> /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/kube-apiserver_v1.35.0-beta.0 succeeded
	I1202 19:13:43.240809   54807 cache.go:243] Successfully downloaded all kic artifacts
	I1202 19:13:43.240835   54807 start.go:360] acquireMachinesLock for functional-449836: {Name:mk8999fdfa518fc15358d07431fe9bec286a035e Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1202 19:13:43.240864   54807 start.go:364] duration metric: took 20.397µs to acquireMachinesLock for "functional-449836"
	I1202 19:13:43.240875   54807 start.go:96] Skipping create...Using existing machine configuration
	I1202 19:13:43.240879   54807 fix.go:54] fixHost starting: 
	I1202 19:13:43.241152   54807 cli_runner.go:164] Run: docker container inspect functional-449836 --format={{.State.Status}}
	I1202 19:13:43.241336   54807 cache.go:107] acquiring lock: {Name:mkb3ffc95e4b7ac3756206049d851bf516a8abb7 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1202 19:13:43.241393   54807 cache.go:115] /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/gcr.io/k8s-minikube/storage-provisioner_v5 exists
	I1202 19:13:43.241400   54807 cache.go:96] cache image "gcr.io/k8s-minikube/storage-provisioner:v5" -> "/home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/gcr.io/k8s-minikube/storage-provisioner_v5" took 69.973µs
	I1202 19:13:43.241406   54807 cache.go:80] save to tar file gcr.io/k8s-minikube/storage-provisioner:v5 -> /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/gcr.io/k8s-minikube/storage-provisioner_v5 succeeded
	I1202 19:13:43.241456   54807 cache.go:107] acquiring lock: {Name:mk87fcb81abcb9216a37cb770c1db1797c0a7f91 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1202 19:13:43.241496   54807 cache.go:115] /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/kube-scheduler_v1.35.0-beta.0 exists
	I1202 19:13:43.241501   54807 cache.go:96] cache image "registry.k8s.io/kube-scheduler:v1.35.0-beta.0" -> "/home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/kube-scheduler_v1.35.0-beta.0" took 46.589µs
	I1202 19:13:43.241506   54807 cache.go:80] save to tar file registry.k8s.io/kube-scheduler:v1.35.0-beta.0 -> /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/kube-scheduler_v1.35.0-beta.0 succeeded
	I1202 19:13:43.241515   54807 cache.go:107] acquiring lock: {Name:mkb0da8840651a370490ea2b46213e13fc0d5dac Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1202 19:13:43.241539   54807 cache.go:115] /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/kube-proxy_v1.35.0-beta.0 exists
	I1202 19:13:43.241543   54807 cache.go:96] cache image "registry.k8s.io/kube-proxy:v1.35.0-beta.0" -> "/home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/kube-proxy_v1.35.0-beta.0" took 29.662µs
	I1202 19:13:43.241548   54807 cache.go:80] save to tar file registry.k8s.io/kube-proxy:v1.35.0-beta.0 -> /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/kube-proxy_v1.35.0-beta.0 succeeded
	I1202 19:13:43.241556   54807 cache.go:107] acquiring lock: {Name:mk9eec99a3e8e54b076a2ce506d08ceb8a7f49cb Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1202 19:13:43.241581   54807 cache.go:115] /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/pause_3.10.1 exists
	I1202 19:13:43.241585   54807 cache.go:96] cache image "registry.k8s.io/pause:3.10.1" -> "/home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/pause_3.10.1" took 29.85µs
	I1202 19:13:43.241589   54807 cache.go:80] save to tar file registry.k8s.io/pause:3.10.1 -> /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/pause_3.10.1 succeeded
	I1202 19:13:43.241615   54807 cache.go:107] acquiring lock: {Name:mk280b51a6d3bfe0cb60ae7355309f1bf1f99e1d Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1202 19:13:43.241641   54807 cache.go:115] /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/etcd_3.6.5-0 exists
	I1202 19:13:43.241629   54807 cache.go:107] acquiring lock: {Name:mkbf2c8ea9fae755e8e7ae1c483527f313757bae Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1202 19:13:43.241645   54807 cache.go:96] cache image "registry.k8s.io/etcd:3.6.5-0" -> "/home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/etcd_3.6.5-0" took 32.345µs
	I1202 19:13:43.241650   54807 cache.go:80] save to tar file registry.k8s.io/etcd:3.6.5-0 -> /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/etcd_3.6.5-0 succeeded
	I1202 19:13:43.241693   54807 cache.go:115] /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/coredns/coredns_v1.13.1 exists
	I1202 19:13:43.241700   54807 cache.go:96] cache image "registry.k8s.io/coredns/coredns:v1.13.1" -> "/home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/coredns/coredns_v1.13.1" took 86.392µs
	I1202 19:13:43.241706   54807 cache.go:80] save to tar file registry.k8s.io/coredns/coredns:v1.13.1 -> /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/coredns/coredns_v1.13.1 succeeded
	I1202 19:13:43.241720   54807 cache.go:87] Successfully saved all images to host disk.
	I1202 19:13:43.258350   54807 fix.go:112] recreateIfNeeded on functional-449836: state=Running err=<nil>
	W1202 19:13:43.258376   54807 fix.go:138] unexpected machine state, will restart: <nil>
	I1202 19:13:43.261600   54807 out.go:252] * Updating the running docker "functional-449836" container ...
	I1202 19:13:43.261627   54807 machine.go:94] provisionDockerMachine start ...
	I1202 19:13:43.261705   54807 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-449836
	I1202 19:13:43.278805   54807 main.go:143] libmachine: Using SSH client type: native
	I1202 19:13:43.279129   54807 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3dad70] 0x3dd270 <nil>  [] 0s} 127.0.0.1 32788 <nil> <nil>}
	I1202 19:13:43.279134   54807 main.go:143] libmachine: About to run SSH command:
	hostname
	I1202 19:13:43.427938   54807 main.go:143] libmachine: SSH cmd err, output: <nil>: functional-449836
	
	I1202 19:13:43.427951   54807 ubuntu.go:182] provisioning hostname "functional-449836"
	I1202 19:13:43.428028   54807 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-449836
	I1202 19:13:43.447456   54807 main.go:143] libmachine: Using SSH client type: native
	I1202 19:13:43.447752   54807 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3dad70] 0x3dd270 <nil>  [] 0s} 127.0.0.1 32788 <nil> <nil>}
	I1202 19:13:43.447759   54807 main.go:143] libmachine: About to run SSH command:
	sudo hostname functional-449836 && echo "functional-449836" | sudo tee /etc/hostname
	I1202 19:13:43.605729   54807 main.go:143] libmachine: SSH cmd err, output: <nil>: functional-449836
	
	I1202 19:13:43.605800   54807 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-449836
	I1202 19:13:43.624976   54807 main.go:143] libmachine: Using SSH client type: native
	I1202 19:13:43.625283   54807 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3dad70] 0x3dd270 <nil>  [] 0s} 127.0.0.1 32788 <nil> <nil>}
	I1202 19:13:43.625296   54807 main.go:143] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sfunctional-449836' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 functional-449836/g' /etc/hosts;
				else 
					echo '127.0.1.1 functional-449836' | sudo tee -a /etc/hosts; 
				fi
			fi
	I1202 19:13:43.772540   54807 main.go:143] libmachine: SSH cmd err, output: <nil>: 
	I1202 19:13:43.772562   54807 ubuntu.go:188] set auth options {CertDir:/home/jenkins/minikube-integration/22021-2487/.minikube CaCertPath:/home/jenkins/minikube-integration/22021-2487/.minikube/certs/ca.pem CaPrivateKeyPath:/home/jenkins/minikube-integration/22021-2487/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/home/jenkins/minikube-integration/22021-2487/.minikube/machines/server.pem ServerKeyPath:/home/jenkins/minikube-integration/22021-2487/.minikube/machines/server-key.pem ClientKeyPath:/home/jenkins/minikube-integration/22021-2487/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/home/jenkins/minikube-integration/22021-2487/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/home/jenkins/minikube-integration/22021-2487/.minikube}
	I1202 19:13:43.772595   54807 ubuntu.go:190] setting up certificates
	I1202 19:13:43.772604   54807 provision.go:84] configureAuth start
	I1202 19:13:43.772671   54807 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" functional-449836
	I1202 19:13:43.790248   54807 provision.go:143] copyHostCerts
	I1202 19:13:43.790316   54807 exec_runner.go:144] found /home/jenkins/minikube-integration/22021-2487/.minikube/ca.pem, removing ...
	I1202 19:13:43.790328   54807 exec_runner.go:203] rm: /home/jenkins/minikube-integration/22021-2487/.minikube/ca.pem
	I1202 19:13:43.790400   54807 exec_runner.go:151] cp: /home/jenkins/minikube-integration/22021-2487/.minikube/certs/ca.pem --> /home/jenkins/minikube-integration/22021-2487/.minikube/ca.pem (1082 bytes)
	I1202 19:13:43.790504   54807 exec_runner.go:144] found /home/jenkins/minikube-integration/22021-2487/.minikube/cert.pem, removing ...
	I1202 19:13:43.790515   54807 exec_runner.go:203] rm: /home/jenkins/minikube-integration/22021-2487/.minikube/cert.pem
	I1202 19:13:43.790538   54807 exec_runner.go:151] cp: /home/jenkins/minikube-integration/22021-2487/.minikube/certs/cert.pem --> /home/jenkins/minikube-integration/22021-2487/.minikube/cert.pem (1123 bytes)
	I1202 19:13:43.790586   54807 exec_runner.go:144] found /home/jenkins/minikube-integration/22021-2487/.minikube/key.pem, removing ...
	I1202 19:13:43.790589   54807 exec_runner.go:203] rm: /home/jenkins/minikube-integration/22021-2487/.minikube/key.pem
	I1202 19:13:43.790610   54807 exec_runner.go:151] cp: /home/jenkins/minikube-integration/22021-2487/.minikube/certs/key.pem --> /home/jenkins/minikube-integration/22021-2487/.minikube/key.pem (1675 bytes)
	I1202 19:13:43.790652   54807 provision.go:117] generating server cert: /home/jenkins/minikube-integration/22021-2487/.minikube/machines/server.pem ca-key=/home/jenkins/minikube-integration/22021-2487/.minikube/certs/ca.pem private-key=/home/jenkins/minikube-integration/22021-2487/.minikube/certs/ca-key.pem org=jenkins.functional-449836 san=[127.0.0.1 192.168.49.2 functional-449836 localhost minikube]
	I1202 19:13:43.836362   54807 provision.go:177] copyRemoteCerts
	I1202 19:13:43.836414   54807 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I1202 19:13:43.836453   54807 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-449836
	I1202 19:13:43.856436   54807 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:32788 SSHKeyPath:/home/jenkins/minikube-integration/22021-2487/.minikube/machines/functional-449836/id_rsa Username:docker}
	I1202 19:13:43.960942   54807 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1082 bytes)
	I1202 19:13:43.990337   54807 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/machines/server.pem --> /etc/docker/server.pem (1220 bytes)
	I1202 19:13:44.010316   54807 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1675 bytes)
	I1202 19:13:44.028611   54807 provision.go:87] duration metric: took 255.971492ms to configureAuth
	I1202 19:13:44.028629   54807 ubuntu.go:206] setting minikube options for container-runtime
	I1202 19:13:44.028821   54807 config.go:182] Loaded profile config "functional-449836": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-beta.0
	I1202 19:13:44.028827   54807 machine.go:97] duration metric: took 767.195405ms to provisionDockerMachine
	I1202 19:13:44.028833   54807 start.go:293] postStartSetup for "functional-449836" (driver="docker")
	I1202 19:13:44.028844   54807 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I1202 19:13:44.028890   54807 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I1202 19:13:44.028937   54807 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-449836
	I1202 19:13:44.046629   54807 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:32788 SSHKeyPath:/home/jenkins/minikube-integration/22021-2487/.minikube/machines/functional-449836/id_rsa Username:docker}
	I1202 19:13:44.156467   54807 ssh_runner.go:195] Run: cat /etc/os-release
	I1202 19:13:44.159958   54807 main.go:143] libmachine: Couldn't set key VERSION_CODENAME, no corresponding struct field found
	I1202 19:13:44.159979   54807 info.go:137] Remote host: Debian GNU/Linux 12 (bookworm)
	I1202 19:13:44.159992   54807 filesync.go:126] Scanning /home/jenkins/minikube-integration/22021-2487/.minikube/addons for local assets ...
	I1202 19:13:44.160053   54807 filesync.go:126] Scanning /home/jenkins/minikube-integration/22021-2487/.minikube/files for local assets ...
	I1202 19:13:44.160131   54807 filesync.go:149] local asset: /home/jenkins/minikube-integration/22021-2487/.minikube/files/etc/ssl/certs/44352.pem -> 44352.pem in /etc/ssl/certs
	I1202 19:13:44.160205   54807 filesync.go:149] local asset: /home/jenkins/minikube-integration/22021-2487/.minikube/files/etc/test/nested/copy/4435/hosts -> hosts in /etc/test/nested/copy/4435
	I1202 19:13:44.160247   54807 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs /etc/test/nested/copy/4435
	I1202 19:13:44.167846   54807 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/files/etc/ssl/certs/44352.pem --> /etc/ssl/certs/44352.pem (1708 bytes)
	I1202 19:13:44.185707   54807 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/files/etc/test/nested/copy/4435/hosts --> /etc/test/nested/copy/4435/hosts (40 bytes)
	I1202 19:13:44.203573   54807 start.go:296] duration metric: took 174.725487ms for postStartSetup
	I1202 19:13:44.203665   54807 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I1202 19:13:44.203703   54807 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-449836
	I1202 19:13:44.221082   54807 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:32788 SSHKeyPath:/home/jenkins/minikube-integration/22021-2487/.minikube/machines/functional-449836/id_rsa Username:docker}
	I1202 19:13:44.321354   54807 ssh_runner.go:195] Run: sh -c "df -BG /var | awk 'NR==2{print $4}'"
	I1202 19:13:44.325951   54807 fix.go:56] duration metric: took 1.085065634s for fixHost
	I1202 19:13:44.325966   54807 start.go:83] releasing machines lock for "functional-449836", held for 1.08509619s
	I1202 19:13:44.326041   54807 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" functional-449836
	I1202 19:13:44.343136   54807 ssh_runner.go:195] Run: cat /version.json
	I1202 19:13:44.343179   54807 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-449836
	I1202 19:13:44.343439   54807 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I1202 19:13:44.343497   54807 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-449836
	I1202 19:13:44.361296   54807 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:32788 SSHKeyPath:/home/jenkins/minikube-integration/22021-2487/.minikube/machines/functional-449836/id_rsa Username:docker}
	I1202 19:13:44.363895   54807 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:32788 SSHKeyPath:/home/jenkins/minikube-integration/22021-2487/.minikube/machines/functional-449836/id_rsa Username:docker}
	I1202 19:13:44.464126   54807 ssh_runner.go:195] Run: systemctl --version
	I1202 19:13:44.557588   54807 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	W1202 19:13:44.561902   54807 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I1202 19:13:44.561962   54807 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%p, " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I1202 19:13:44.569598   54807 cni.go:259] no active bridge cni configs found in "/etc/cni/net.d" - nothing to disable
	I1202 19:13:44.569611   54807 start.go:496] detecting cgroup driver to use...
	I1202 19:13:44.569649   54807 detect.go:187] detected "cgroupfs" cgroup driver on host os
	I1202 19:13:44.569710   54807 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I1202 19:13:44.587349   54807 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I1202 19:13:44.609174   54807 docker.go:218] disabling cri-docker service (if available) ...
	I1202 19:13:44.609228   54807 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.socket
	I1202 19:13:44.629149   54807 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.service
	I1202 19:13:44.643983   54807 ssh_runner.go:195] Run: sudo systemctl disable cri-docker.socket
	I1202 19:13:44.758878   54807 ssh_runner.go:195] Run: sudo systemctl mask cri-docker.service
	I1202 19:13:44.879635   54807 docker.go:234] disabling docker service ...
	I1202 19:13:44.879691   54807 ssh_runner.go:195] Run: sudo systemctl stop -f docker.socket
	I1202 19:13:44.895449   54807 ssh_runner.go:195] Run: sudo systemctl stop -f docker.service
	I1202 19:13:44.908858   54807 ssh_runner.go:195] Run: sudo systemctl disable docker.socket
	I1202 19:13:45.045971   54807 ssh_runner.go:195] Run: sudo systemctl mask docker.service
	I1202 19:13:45.189406   54807 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service docker
	I1202 19:13:45.215003   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I1202 19:13:45.239052   54807 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.10.1"|' /etc/containerd/config.toml"
	I1202 19:13:45.252425   54807 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I1202 19:13:45.264818   54807 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I1202 19:13:45.264881   54807 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I1202 19:13:45.275398   54807 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I1202 19:13:45.286201   54807 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I1202 19:13:45.295830   54807 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I1202 19:13:45.307108   54807 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I1202 19:13:45.315922   54807 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I1202 19:13:45.325735   54807 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I1202 19:13:45.336853   54807 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
	I1202 19:13:45.346391   54807 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I1202 19:13:45.354212   54807 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I1202 19:13:45.361966   54807 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1202 19:13:45.496442   54807 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I1202 19:13:45.617692   54807 start.go:543] Will wait 60s for socket path /run/containerd/containerd.sock
	I1202 19:13:45.617755   54807 ssh_runner.go:195] Run: stat /run/containerd/containerd.sock
	I1202 19:13:45.622143   54807 start.go:564] Will wait 60s for crictl version
	I1202 19:13:45.622212   54807 ssh_runner.go:195] Run: which crictl
	I1202 19:13:45.626172   54807 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl version
	I1202 19:13:45.650746   54807 start.go:580] Version:  0.1.0
	RuntimeName:  containerd
	RuntimeVersion:  v2.1.5
	RuntimeApiVersion:  v1
	I1202 19:13:45.650812   54807 ssh_runner.go:195] Run: containerd --version
	I1202 19:13:45.670031   54807 ssh_runner.go:195] Run: containerd --version
	I1202 19:13:45.697284   54807 out.go:179] * Preparing Kubernetes v1.35.0-beta.0 on containerd 2.1.5 ...
	I1202 19:13:45.700249   54807 cli_runner.go:164] Run: docker network inspect functional-449836 --format "{"Name": "{{.Name}}","Driver": "{{.Driver}}","Subnet": "{{range .IPAM.Config}}{{.Subnet}}{{end}}","Gateway": "{{range .IPAM.Config}}{{.Gateway}}{{end}}","MTU": {{if (index .Options "com.docker.network.driver.mtu")}}{{(index .Options "com.docker.network.driver.mtu")}}{{else}}0{{end}}, "ContainerIPs": [{{range $k,$v := .Containers }}"{{$v.IPv4Address}}",{{end}}]}"
	I1202 19:13:45.717142   54807 ssh_runner.go:195] Run: grep 192.168.49.1	host.minikube.internal$ /etc/hosts
	I1202 19:13:45.724151   54807 out.go:179]   - apiserver.enable-admission-plugins=NamespaceAutoProvision
	I1202 19:13:45.727141   54807 kubeadm.go:884] updating cluster {Name:functional-449836 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-449836 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} ...
	I1202 19:13:45.727279   54807 preload.go:188] Checking if preload exists for k8s version v1.35.0-beta.0 and runtime containerd
	I1202 19:13:45.727346   54807 ssh_runner.go:195] Run: sudo crictl images --output json
	I1202 19:13:45.751767   54807 containerd.go:627] all images are preloaded for containerd runtime.
	I1202 19:13:45.751786   54807 cache_images.go:86] Images are preloaded, skipping loading
	I1202 19:13:45.751792   54807 kubeadm.go:935] updating node { 192.168.49.2 8441 v1.35.0-beta.0 containerd true true} ...
	I1202 19:13:45.751903   54807 kubeadm.go:947] kubelet [Unit]
	Wants=containerd.service
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.35.0-beta.0/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=functional-449836 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.168.49.2
	
	[Install]
	 config:
	{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-449836 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I1202 19:13:45.751976   54807 ssh_runner.go:195] Run: sudo crictl info
	I1202 19:13:45.777030   54807 extraconfig.go:125] Overwriting default enable-admission-plugins=NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota with user provided enable-admission-plugins=NamespaceAutoProvision for component apiserver
	I1202 19:13:45.777052   54807 cni.go:84] Creating CNI manager for ""
	I1202 19:13:45.777060   54807 cni.go:143] "docker" driver + "containerd" runtime found, recommending kindnet
	I1202 19:13:45.777073   54807 kubeadm.go:85] Using pod CIDR: 10.244.0.0/16
	I1202 19:13:45.777095   54807 kubeadm.go:190] kubeadm options: {CertDir:/var/lib/minikube/certs ServiceCIDR:10.96.0.0/12 PodSubnet:10.244.0.0/16 AdvertiseAddress:192.168.49.2 APIServerPort:8441 KubernetesVersion:v1.35.0-beta.0 EtcdDataDir:/var/lib/minikube/etcd EtcdExtraArgs:map[] ClusterName:functional-449836 NodeName:functional-449836 DNSDomain:cluster.local CRISocket:/run/containerd/containerd.sock ImageRepository: ComponentOptions:[{Component:apiServer ExtraArgs:map[enable-admission-plugins:NamespaceAutoProvision] Pairs:map[certSANs:["127.0.0.1", "localhost", "192.168.49.2"]]} {Component:controllerManager ExtraArgs:map[allocate-node-cidrs:true leader-elect:false] Pairs:map[]} {Component:scheduler ExtraArgs:map[leader-elect:false] Pairs:map[]}] FeatureArgs:map[] NodeIP:192.168.49.2 CgroupDriver:cgroupfs ClientCAFile:/var/lib/minikube/certs/ca.crt StaticPodPath:/etc/kubernetes/manifests ControlPlaneAddress:control-plane.minikube.internal KubeProxyOptions:map[] ResolvConfSearchRegression:false KubeletConfigOpts:map[containerRuntimeEndpoint:unix:///run/containerd/containerd.sock hairpinMode:hairpin-veth runtimeRequestTimeout:15m] PrependCriSocketUnix:true}
	I1202 19:13:45.777203   54807 kubeadm.go:196] kubeadm config:
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: InitConfiguration
	localAPIEndpoint:
	  advertiseAddress: 192.168.49.2
	  bindPort: 8441
	bootstrapTokens:
	  - groups:
	      - system:bootstrappers:kubeadm:default-node-token
	    ttl: 24h0m0s
	    usages:
	      - signing
	      - authentication
	nodeRegistration:
	  criSocket: unix:///run/containerd/containerd.sock
	  name: "functional-449836"
	  kubeletExtraArgs:
	    - name: "node-ip"
	      value: "192.168.49.2"
	  taints: []
	---
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: ClusterConfiguration
	apiServer:
	  certSANs: ["127.0.0.1", "localhost", "192.168.49.2"]
	  extraArgs:
	    - name: "enable-admission-plugins"
	      value: "NamespaceAutoProvision"
	controllerManager:
	  extraArgs:
	    - name: "allocate-node-cidrs"
	      value: "true"
	    - name: "leader-elect"
	      value: "false"
	scheduler:
	  extraArgs:
	    - name: "leader-elect"
	      value: "false"
	certificatesDir: /var/lib/minikube/certs
	clusterName: mk
	controlPlaneEndpoint: control-plane.minikube.internal:8441
	etcd:
	  local:
	    dataDir: /var/lib/minikube/etcd
	kubernetesVersion: v1.35.0-beta.0
	networking:
	  dnsDomain: cluster.local
	  podSubnet: "10.244.0.0/16"
	  serviceSubnet: 10.96.0.0/12
	---
	apiVersion: kubelet.config.k8s.io/v1beta1
	kind: KubeletConfiguration
	authentication:
	  x509:
	    clientCAFile: /var/lib/minikube/certs/ca.crt
	cgroupDriver: cgroupfs
	containerRuntimeEndpoint: unix:///run/containerd/containerd.sock
	hairpinMode: hairpin-veth
	runtimeRequestTimeout: 15m
	clusterDomain: "cluster.local"
	# disable disk resource management by default
	imageGCHighThresholdPercent: 100
	evictionHard:
	  nodefs.available: "0%"
	  nodefs.inodesFree: "0%"
	  imagefs.available: "0%"
	failSwapOn: false
	staticPodPath: /etc/kubernetes/manifests
	---
	apiVersion: kubeproxy.config.k8s.io/v1alpha1
	kind: KubeProxyConfiguration
	clusterCIDR: "10.244.0.0/16"
	metricsBindAddress: 0.0.0.0:10249
	conntrack:
	  maxPerCore: 0
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_established"
	  tcpEstablishedTimeout: 0s
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_close"
	  tcpCloseWaitTimeout: 0s
	
	I1202 19:13:45.777274   54807 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.35.0-beta.0
	I1202 19:13:45.785000   54807 binaries.go:51] Found k8s binaries, skipping transfer
	I1202 19:13:45.785061   54807 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /var/tmp/minikube
	I1202 19:13:45.792592   54807 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (328 bytes)
	I1202 19:13:45.805336   54807 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (359 bytes)
	I1202 19:13:45.818427   54807 ssh_runner.go:362] scp memory --> /var/tmp/minikube/kubeadm.yaml.new (2087 bytes)
	I1202 19:13:45.830990   54807 ssh_runner.go:195] Run: grep 192.168.49.2	control-plane.minikube.internal$ /etc/hosts
	I1202 19:13:45.834935   54807 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1202 19:13:45.945402   54807 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I1202 19:13:46.172299   54807 certs.go:69] Setting up /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836 for IP: 192.168.49.2
	I1202 19:13:46.172311   54807 certs.go:195] generating shared ca certs ...
	I1202 19:13:46.172340   54807 certs.go:227] acquiring lock for ca certs: {Name:mk2ce7651a779b9fbf8eac798f9ac184328de0c6 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1202 19:13:46.172494   54807 certs.go:236] skipping valid "minikubeCA" ca cert: /home/jenkins/minikube-integration/22021-2487/.minikube/ca.key
	I1202 19:13:46.172550   54807 certs.go:236] skipping valid "proxyClientCA" ca cert: /home/jenkins/minikube-integration/22021-2487/.minikube/proxy-client-ca.key
	I1202 19:13:46.172557   54807 certs.go:257] generating profile certs ...
	I1202 19:13:46.172651   54807 certs.go:360] skipping valid signed profile cert regeneration for "minikube-user": /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/client.key
	I1202 19:13:46.172725   54807 certs.go:360] skipping valid signed profile cert regeneration for "minikube": /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/apiserver.key.a65b71da
	I1202 19:13:46.172770   54807 certs.go:360] skipping valid signed profile cert regeneration for "aggregator": /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/proxy-client.key
	I1202 19:13:46.172876   54807 certs.go:484] found cert: /home/jenkins/minikube-integration/22021-2487/.minikube/certs/4435.pem (1338 bytes)
	W1202 19:13:46.172906   54807 certs.go:480] ignoring /home/jenkins/minikube-integration/22021-2487/.minikube/certs/4435_empty.pem, impossibly tiny 0 bytes
	I1202 19:13:46.172913   54807 certs.go:484] found cert: /home/jenkins/minikube-integration/22021-2487/.minikube/certs/ca-key.pem (1679 bytes)
	I1202 19:13:46.172944   54807 certs.go:484] found cert: /home/jenkins/minikube-integration/22021-2487/.minikube/certs/ca.pem (1082 bytes)
	I1202 19:13:46.172967   54807 certs.go:484] found cert: /home/jenkins/minikube-integration/22021-2487/.minikube/certs/cert.pem (1123 bytes)
	I1202 19:13:46.172992   54807 certs.go:484] found cert: /home/jenkins/minikube-integration/22021-2487/.minikube/certs/key.pem (1675 bytes)
	I1202 19:13:46.173034   54807 certs.go:484] found cert: /home/jenkins/minikube-integration/22021-2487/.minikube/files/etc/ssl/certs/44352.pem (1708 bytes)
	I1202 19:13:46.174236   54807 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I1202 19:13:46.206005   54807 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1675 bytes)
	I1202 19:13:46.223256   54807 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I1202 19:13:46.250390   54807 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1675 bytes)
	I1202 19:13:46.270550   54807 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1424 bytes)
	I1202 19:13:46.289153   54807 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1675 bytes)
	I1202 19:13:46.307175   54807 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I1202 19:13:46.325652   54807 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1675 bytes)
	I1202 19:13:46.343823   54807 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/files/etc/ssl/certs/44352.pem --> /usr/share/ca-certificates/44352.pem (1708 bytes)
	I1202 19:13:46.361647   54807 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I1202 19:13:46.379597   54807 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/certs/4435.pem --> /usr/share/ca-certificates/4435.pem (1338 bytes)
	I1202 19:13:46.397750   54807 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (738 bytes)
	I1202 19:13:46.411087   54807 ssh_runner.go:195] Run: openssl version
	I1202 19:13:46.418777   54807 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/4435.pem && ln -fs /usr/share/ca-certificates/4435.pem /etc/ssl/certs/4435.pem"
	I1202 19:13:46.427262   54807 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/4435.pem
	I1202 19:13:46.431022   54807 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Dec  2 18:58 /usr/share/ca-certificates/4435.pem
	I1202 19:13:46.431093   54807 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/4435.pem
	I1202 19:13:46.473995   54807 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/51391683.0 || ln -fs /etc/ssl/certs/4435.pem /etc/ssl/certs/51391683.0"
	I1202 19:13:46.482092   54807 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/44352.pem && ln -fs /usr/share/ca-certificates/44352.pem /etc/ssl/certs/44352.pem"
	I1202 19:13:46.490432   54807 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/44352.pem
	I1202 19:13:46.494266   54807 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Dec  2 18:58 /usr/share/ca-certificates/44352.pem
	I1202 19:13:46.494320   54807 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/44352.pem
	I1202 19:13:46.535125   54807 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/3ec20f2e.0 || ln -fs /etc/ssl/certs/44352.pem /etc/ssl/certs/3ec20f2e.0"
	I1202 19:13:46.543277   54807 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I1202 19:13:46.551769   54807 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I1202 19:13:46.555743   54807 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Dec  2 18:48 /usr/share/ca-certificates/minikubeCA.pem
	I1202 19:13:46.555797   54807 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I1202 19:13:46.597778   54807 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
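The hash-and-symlink sequence above installs each CA into the system trust directory under its OpenSSL subject hash (e.g. `b5213941.0` for minikubeCA.pem). A minimal sketch of that mechanism, using a throwaway self-signed cert in a temp directory rather than minikube's actual files:

```shell
set -eu
tmp=$(mktemp -d)
# Generate a throwaway cert to stand in for minikubeCA.pem (illustrative only).
openssl req -x509 -newkey rsa:2048 -nodes -keyout "$tmp/ca.key" \
  -out "$tmp/minikubeCA.pem" -days 1 -subj "/CN=minikubeCA" 2>/dev/null
# OpenSSL's subject hash is the filename the TLS stack looks up.
hash=$(openssl x509 -hash -noout -in "$tmp/minikubeCA.pem")
# The trust directory expects <hash>.0 -> certificate, as in the log above.
ln -fs "$tmp/minikubeCA.pem" "$tmp/$hash.0"
ls -l "$tmp/$hash.0"
```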
	I1202 19:13:46.605874   54807 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I1202 19:13:46.609733   54807 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-etcd-client.crt -checkend 86400
	I1202 19:13:46.652482   54807 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-kubelet-client.crt -checkend 86400
	I1202 19:13:46.693214   54807 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/server.crt -checkend 86400
	I1202 19:13:46.734654   54807 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/healthcheck-client.crt -checkend 86400
	I1202 19:13:46.775729   54807 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/peer.crt -checkend 86400
	I1202 19:13:46.821319   54807 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/front-proxy-client.crt -checkend 86400
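Each `-checkend 86400` probe above exits 0 only if the certificate remains valid for at least another 86400 seconds (24h); a non-zero exit would trigger regeneration. A sketch with a throwaway cert (paths illustrative):

```shell
set -eu
tmp=$(mktemp -d)
# Throwaway cert valid for a year, standing in for apiserver-etcd-client.crt etc.
openssl req -x509 -newkey rsa:2048 -nodes -keyout "$tmp/key.pem" \
  -out "$tmp/cert.pem" -days 365 -subj "/CN=demo" 2>/dev/null
# -checkend N: exit 0 if still valid N seconds from now, non-zero otherwise.
if openssl x509 -noout -in "$tmp/cert.pem" -checkend 86400 >/dev/null; then
  echo "valid for at least 24h"
else
  echo "expires within 24h - would regenerate"
fi
```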
	I1202 19:13:46.862299   54807 kubeadm.go:401] StartCluster: {Name:functional-449836 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-449836 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1202 19:13:46.862398   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:paused Name: Namespaces:[kube-system]}
	I1202 19:13:46.862468   54807 ssh_runner.go:195] Run: sudo -s eval "crictl ps -a --quiet --label io.kubernetes.pod.namespace=kube-system"
	I1202 19:13:46.891099   54807 cri.go:89] found id: ""
	I1202 19:13:46.891159   54807 ssh_runner.go:195] Run: sudo ls /var/lib/kubelet/kubeadm-flags.env /var/lib/kubelet/config.yaml /var/lib/minikube/etcd
	I1202 19:13:46.898813   54807 kubeadm.go:417] found existing configuration files, will attempt cluster restart
	I1202 19:13:46.898821   54807 kubeadm.go:598] restartPrimaryControlPlane start ...
	I1202 19:13:46.898874   54807 ssh_runner.go:195] Run: sudo test -d /data/minikube
	I1202 19:13:46.906272   54807 kubeadm.go:131] /data/minikube skipping compat symlinks: sudo test -d /data/minikube: Process exited with status 1
	stdout:
	
	stderr:
	I1202 19:13:46.906775   54807 kubeconfig.go:125] found "functional-449836" server: "https://192.168.49.2:8441"
	I1202 19:13:46.908038   54807 ssh_runner.go:195] Run: sudo diff -u /var/tmp/minikube/kubeadm.yaml /var/tmp/minikube/kubeadm.yaml.new
	I1202 19:13:46.915724   54807 kubeadm.go:645] detected kubeadm config drift (will reconfigure cluster from new /var/tmp/minikube/kubeadm.yaml):
	-- stdout --
	--- /var/tmp/minikube/kubeadm.yaml	2025-12-02 18:59:11.521818114 +0000
	+++ /var/tmp/minikube/kubeadm.yaml.new	2025-12-02 19:13:45.826341203 +0000
	@@ -24,7 +24,7 @@
	   certSANs: ["127.0.0.1", "localhost", "192.168.49.2"]
	   extraArgs:
	     - name: "enable-admission-plugins"
	-      value: "NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota"
	+      value: "NamespaceAutoProvision"
	 controllerManager:
	   extraArgs:
	     - name: "allocate-node-cidrs"
	
	-- /stdout --
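The drift check above relies on `diff -u` exiting non-zero when the deployed kubeadm.yaml differs from the freshly rendered `.new` file, which is what flags "will reconfigure cluster". A sketch of that decision, with stand-in one-line configs in a temp directory:

```shell
set -eu
tmp=$(mktemp -d)
# Stand-in contents mirroring the admission-plugins change in the log above.
printf 'value: "NamespaceLifecycle,LimitRanger"\n' > "$tmp/kubeadm.yaml"
printf 'value: "NamespaceAutoProvision"\n'         > "$tmp/kubeadm.yaml.new"
# diff -u exits 0 when identical, 1 when the files differ.
if diff -u "$tmp/kubeadm.yaml" "$tmp/kubeadm.yaml.new" > "$tmp/drift.diff"; then
  echo "configs match - restart without reconfigure"
else
  echo "config drift - adopting new kubeadm.yaml"
  cp "$tmp/kubeadm.yaml.new" "$tmp/kubeadm.yaml"
fi
```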
	I1202 19:13:46.915744   54807 kubeadm.go:1161] stopping kube-system containers ...
	I1202 19:13:46.915757   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name: Namespaces:[kube-system]}
	I1202 19:13:46.915816   54807 ssh_runner.go:195] Run: sudo -s eval "crictl ps -a --quiet --label io.kubernetes.pod.namespace=kube-system"
	I1202 19:13:46.943936   54807 cri.go:89] found id: ""
	I1202 19:13:46.944009   54807 ssh_runner.go:195] Run: sudo systemctl stop kubelet
	I1202 19:13:46.961843   54807 ssh_runner.go:195] Run: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf
	I1202 19:13:46.971074   54807 kubeadm.go:158] found existing configuration files:
	-rw------- 1 root root 5635 Dec  2 19:03 /etc/kubernetes/admin.conf
	-rw------- 1 root root 5640 Dec  2 19:03 /etc/kubernetes/controller-manager.conf
	-rw------- 1 root root 5672 Dec  2 19:03 /etc/kubernetes/kubelet.conf
	-rw------- 1 root root 5584 Dec  2 19:03 /etc/kubernetes/scheduler.conf
	
	I1202 19:13:46.971137   54807 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/admin.conf
	I1202 19:13:46.979452   54807 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf
	I1202 19:13:46.987399   54807 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/kubelet.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf: Process exited with status 1
	stdout:
	
	stderr:
	I1202 19:13:46.987454   54807 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/kubelet.conf
	I1202 19:13:46.994869   54807 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf
	I1202 19:13:47.002498   54807 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/controller-manager.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf: Process exited with status 1
	stdout:
	
	stderr:
	I1202 19:13:47.002560   54807 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/controller-manager.conf
	I1202 19:13:47.010116   54807 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf
	I1202 19:13:47.017891   54807 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/scheduler.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf: Process exited with status 1
	stdout:
	
	stderr:
	I1202 19:13:47.017946   54807 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/scheduler.conf
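The grep-then-remove pattern above is a sanity check: if the expected API endpoint string is absent from a kubeconfig, the file is deleted so a later `kubeadm init phase kubeconfig` regenerates it. A sketch with the endpoint from the log but a temp-dir file:

```shell
set -eu
tmp=$(mktemp -d)
# A stale kubeconfig pointing at the node IP rather than the internal name.
printf 'server: https://192.168.49.2:8441\n' > "$tmp/kubelet.conf"
endpoint='https://control-plane.minikube.internal:8441'
# grep -q exits non-zero when the endpoint is missing; remove for regeneration.
if ! grep -q "$endpoint" "$tmp/kubelet.conf"; then
  echo "endpoint not in kubelet.conf - removing so kubeadm regenerates it"
  rm -f "$tmp/kubelet.conf"
fi
```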
	I1202 19:13:47.025383   54807 ssh_runner.go:195] Run: sudo cp /var/tmp/minikube/kubeadm.yaml.new /var/tmp/minikube/kubeadm.yaml
	I1202 19:13:47.033423   54807 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init phase certs all --config /var/tmp/minikube/kubeadm.yaml"
	I1202 19:13:47.076377   54807 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init phase kubeconfig all --config /var/tmp/minikube/kubeadm.yaml"
	I1202 19:13:48.395417   54807 ssh_runner.go:235] Completed: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init phase kubeconfig all --config /var/tmp/minikube/kubeadm.yaml": (1.319015091s)
	I1202 19:13:48.395495   54807 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init phase kubelet-start --config /var/tmp/minikube/kubeadm.yaml"
	I1202 19:13:48.604942   54807 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init phase control-plane all --config /var/tmp/minikube/kubeadm.yaml"
	I1202 19:13:48.668399   54807 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init phase etcd local --config /var/tmp/minikube/kubeadm.yaml"
	I1202 19:13:48.712382   54807 api_server.go:52] waiting for apiserver process to appear ...
	I1202 19:13:48.712452   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:13:49.212900   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:13:49.713354   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:13:50.213340   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:13:50.713260   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:13:51.212621   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:13:51.713471   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:13:52.213212   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:13:52.712687   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:13:53.212572   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:13:53.713310   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:13:54.212640   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:13:54.712595   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:13:55.213133   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:13:55.712621   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:13:56.212595   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:13:56.713443   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:13:57.213230   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:13:57.713055   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:13:58.213071   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:13:58.712680   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:13:59.213352   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:13:59.712654   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:00.213647   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:00.712569   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:01.212673   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:01.713030   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:02.212581   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:02.712631   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:03.213287   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:03.712572   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:04.213500   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:04.713557   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:05.213523   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:05.713480   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:06.212772   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:06.713553   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:07.213309   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:07.712616   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:08.212729   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:08.713403   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:09.212625   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:09.713385   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:10.212662   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:10.712619   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:11.213505   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:11.712640   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:12.213396   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:12.712571   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:13.212963   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:13.713403   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:14.213457   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:14.712620   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:15.213335   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:15.713379   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:16.212612   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:16.712624   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:17.212573   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:17.713394   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:18.213294   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:18.713516   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:19.213531   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:19.713309   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:20.212591   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:20.713575   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:21.213560   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:21.713513   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:22.213219   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:22.712620   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:23.213273   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:23.713477   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:24.213364   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:24.712581   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:25.212597   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:25.713554   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:26.213205   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:26.712517   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:27.213345   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:27.712602   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:28.212602   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:28.713533   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:29.213188   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:29.713102   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:30.212626   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:30.712732   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:31.212615   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:31.713473   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:32.212590   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:32.712645   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:33.213398   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:33.713081   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:34.213498   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:34.712625   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:35.213560   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:35.712634   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:36.213370   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:36.712576   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:37.213006   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:37.712656   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:38.212594   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:38.713448   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:39.213442   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:39.712577   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:40.212621   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:40.713516   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:41.212756   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:41.712509   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:42.215715   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:42.712573   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:43.212604   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:43.712621   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:44.213283   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:44.712621   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:45.213407   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:45.712947   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:46.213239   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:46.712626   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:47.213260   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:47.713210   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:48.212639   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
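The run of `pgrep` lines above is a ~500ms polling loop waiting for the kube-apiserver process to appear, bounded by a deadline. A minimal stand-in sketch (a background `sleep` plays the role of the apiserver; the real loop matches `kube-apiserver.*minikube.*`):

```shell
set -u
# Background stand-in process for kube-apiserver.
sleep 5 & standin=$!
deadline=$(( $(date +%s) + 4 ))
rc=1
# Poll every 0.5s until the process shows up or the deadline passes.
while [ "$(date +%s)" -le "$deadline" ]; do
  if pgrep -f '^sleep 5$' >/dev/null 2>&1; then rc=0; break; fi
  sleep 0.5
done
kill "$standin" 2>/dev/null || true
echo "apiserver-standin found: rc=$rc"
```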
	I1202 19:14:48.713264   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:14:48.713347   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:14:48.742977   54807 cri.go:89] found id: ""
	I1202 19:14:48.742990   54807 logs.go:282] 0 containers: []
	W1202 19:14:48.742997   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:14:48.743002   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:14:48.743061   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:14:48.767865   54807 cri.go:89] found id: ""
	I1202 19:14:48.767879   54807 logs.go:282] 0 containers: []
	W1202 19:14:48.767886   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:14:48.767892   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:14:48.767949   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:14:48.792531   54807 cri.go:89] found id: ""
	I1202 19:14:48.792544   54807 logs.go:282] 0 containers: []
	W1202 19:14:48.792560   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:14:48.792566   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:14:48.792624   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:14:48.821644   54807 cri.go:89] found id: ""
	I1202 19:14:48.821657   54807 logs.go:282] 0 containers: []
	W1202 19:14:48.821665   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:14:48.821670   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:14:48.821729   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:14:48.847227   54807 cri.go:89] found id: ""
	I1202 19:14:48.847246   54807 logs.go:282] 0 containers: []
	W1202 19:14:48.847253   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:14:48.847258   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:14:48.847318   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:14:48.872064   54807 cri.go:89] found id: ""
	I1202 19:14:48.872084   54807 logs.go:282] 0 containers: []
	W1202 19:14:48.872091   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:14:48.872097   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:14:48.872155   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:14:48.895905   54807 cri.go:89] found id: ""
	I1202 19:14:48.895919   54807 logs.go:282] 0 containers: []
	W1202 19:14:48.895925   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:14:48.895933   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:14:48.895945   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:14:48.962492   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:14:48.954400   11315 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:14:48.954981   11315 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:14:48.956769   11315 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:14:48.957381   11315 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:14:48.959046   11315 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:14:48.954400   11315 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:14:48.954981   11315 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:14:48.956769   11315 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:14:48.957381   11315 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:14:48.959046   11315 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:14:48.962515   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:14:48.962526   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:14:49.026861   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:14:49.026881   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:14:49.059991   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:14:49.060006   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:14:49.119340   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:14:49.119357   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:14:51.632315   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:51.642501   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:14:51.642560   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:14:51.669041   54807 cri.go:89] found id: ""
	I1202 19:14:51.669054   54807 logs.go:282] 0 containers: []
	W1202 19:14:51.669061   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:14:51.669086   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:14:51.669150   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:14:51.698828   54807 cri.go:89] found id: ""
	I1202 19:14:51.698857   54807 logs.go:282] 0 containers: []
	W1202 19:14:51.698864   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:14:51.698870   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:14:51.698939   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:14:51.739419   54807 cri.go:89] found id: ""
	I1202 19:14:51.739446   54807 logs.go:282] 0 containers: []
	W1202 19:14:51.739454   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:14:51.739459   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:14:51.739532   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:14:51.764613   54807 cri.go:89] found id: ""
	I1202 19:14:51.764627   54807 logs.go:282] 0 containers: []
	W1202 19:14:51.764633   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:14:51.764639   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:14:51.764698   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:14:51.790197   54807 cri.go:89] found id: ""
	I1202 19:14:51.790211   54807 logs.go:282] 0 containers: []
	W1202 19:14:51.790217   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:14:51.790222   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:14:51.790281   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:14:51.824131   54807 cri.go:89] found id: ""
	I1202 19:14:51.824144   54807 logs.go:282] 0 containers: []
	W1202 19:14:51.824151   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:14:51.824170   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:14:51.824228   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:14:51.848893   54807 cri.go:89] found id: ""
	I1202 19:14:51.848907   54807 logs.go:282] 0 containers: []
	W1202 19:14:51.848914   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:14:51.848922   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:14:51.848932   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:14:51.877099   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:14:51.877114   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:14:51.933539   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:14:51.933560   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:14:51.944309   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:14:51.944346   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:14:52.014156   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:14:52.005976   11437 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:14:52.006753   11437 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:14:52.008549   11437 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:14:52.009096   11437 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:14:52.010706   11437 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:14:52.005976   11437 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:14:52.006753   11437 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:14:52.008549   11437 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:14:52.009096   11437 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:14:52.010706   11437 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:14:52.014167   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:14:52.014178   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:14:54.578451   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:54.588802   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:14:54.588862   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:14:54.613620   54807 cri.go:89] found id: ""
	I1202 19:14:54.613633   54807 logs.go:282] 0 containers: []
	W1202 19:14:54.613640   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:14:54.613646   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:14:54.613704   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:14:54.637471   54807 cri.go:89] found id: ""
	I1202 19:14:54.637486   54807 logs.go:282] 0 containers: []
	W1202 19:14:54.637498   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:14:54.637503   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:14:54.637561   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:14:54.662053   54807 cri.go:89] found id: ""
	I1202 19:14:54.662066   54807 logs.go:282] 0 containers: []
	W1202 19:14:54.662073   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:14:54.662079   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:14:54.662135   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:14:54.694901   54807 cri.go:89] found id: ""
	I1202 19:14:54.694916   54807 logs.go:282] 0 containers: []
	W1202 19:14:54.694923   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:14:54.694928   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:14:54.694998   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:14:54.728487   54807 cri.go:89] found id: ""
	I1202 19:14:54.728500   54807 logs.go:282] 0 containers: []
	W1202 19:14:54.728507   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:14:54.728512   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:14:54.728569   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:14:54.756786   54807 cri.go:89] found id: ""
	I1202 19:14:54.756800   54807 logs.go:282] 0 containers: []
	W1202 19:14:54.756806   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:14:54.756812   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:14:54.756868   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:14:54.782187   54807 cri.go:89] found id: ""
	I1202 19:14:54.782200   54807 logs.go:282] 0 containers: []
	W1202 19:14:54.782212   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:14:54.782220   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:14:54.782231   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:14:54.846497   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:14:54.838849   11526 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:14:54.839509   11526 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:14:54.840990   11526 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:14:54.841320   11526 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:14:54.842849   11526 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:14:54.838849   11526 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:14:54.839509   11526 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:14:54.840990   11526 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:14:54.841320   11526 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:14:54.842849   11526 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:14:54.846510   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:14:54.846521   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:14:54.909600   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:14:54.909620   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:14:54.943132   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:14:54.943150   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:14:55.006561   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:14:55.006581   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:14:57.519164   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:57.529445   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:14:57.529506   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:14:57.554155   54807 cri.go:89] found id: ""
	I1202 19:14:57.554168   54807 logs.go:282] 0 containers: []
	W1202 19:14:57.554176   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:14:57.554181   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:14:57.554240   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:14:57.579453   54807 cri.go:89] found id: ""
	I1202 19:14:57.579468   54807 logs.go:282] 0 containers: []
	W1202 19:14:57.579474   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:14:57.579480   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:14:57.579537   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:14:57.608139   54807 cri.go:89] found id: ""
	I1202 19:14:57.608152   54807 logs.go:282] 0 containers: []
	W1202 19:14:57.608160   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:14:57.608165   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:14:57.608224   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:14:57.632309   54807 cri.go:89] found id: ""
	I1202 19:14:57.632360   54807 logs.go:282] 0 containers: []
	W1202 19:14:57.632368   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:14:57.632374   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:14:57.632434   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:14:57.657933   54807 cri.go:89] found id: ""
	I1202 19:14:57.657947   54807 logs.go:282] 0 containers: []
	W1202 19:14:57.657954   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:14:57.657959   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:14:57.658019   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:14:57.698982   54807 cri.go:89] found id: ""
	I1202 19:14:57.698996   54807 logs.go:282] 0 containers: []
	W1202 19:14:57.699002   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:14:57.699008   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:14:57.699105   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:14:57.738205   54807 cri.go:89] found id: ""
	I1202 19:14:57.738219   54807 logs.go:282] 0 containers: []
	W1202 19:14:57.738226   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:14:57.738234   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:14:57.738245   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:14:57.802193   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:14:57.794304   11633 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:14:57.794844   11633 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:14:57.796409   11633 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:14:57.796881   11633 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:14:57.798306   11633 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:14:57.794304   11633 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:14:57.794844   11633 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:14:57.796409   11633 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:14:57.796881   11633 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:14:57.798306   11633 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:14:57.802204   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:14:57.802215   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:14:57.865638   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:14:57.865657   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:14:57.900835   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:14:57.900850   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:14:57.958121   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:14:57.958139   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:15:00.502580   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:15:00.515602   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:15:00.515692   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:15:00.553262   54807 cri.go:89] found id: ""
	I1202 19:15:00.553290   54807 logs.go:282] 0 containers: []
	W1202 19:15:00.553298   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:15:00.553304   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:15:00.553372   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:15:00.592663   54807 cri.go:89] found id: ""
	I1202 19:15:00.592678   54807 logs.go:282] 0 containers: []
	W1202 19:15:00.592686   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:15:00.592691   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:15:00.592782   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:15:00.624403   54807 cri.go:89] found id: ""
	I1202 19:15:00.624423   54807 logs.go:282] 0 containers: []
	W1202 19:15:00.624431   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:15:00.624438   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:15:00.624521   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:15:00.659265   54807 cri.go:89] found id: ""
	I1202 19:15:00.659280   54807 logs.go:282] 0 containers: []
	W1202 19:15:00.659288   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:15:00.659294   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:15:00.659383   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:15:00.695489   54807 cri.go:89] found id: ""
	I1202 19:15:00.695508   54807 logs.go:282] 0 containers: []
	W1202 19:15:00.695517   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:15:00.695523   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:15:00.695592   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:15:00.732577   54807 cri.go:89] found id: ""
	I1202 19:15:00.732592   54807 logs.go:282] 0 containers: []
	W1202 19:15:00.732600   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:15:00.732607   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:15:00.732696   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:15:00.767521   54807 cri.go:89] found id: ""
	I1202 19:15:00.767538   54807 logs.go:282] 0 containers: []
	W1202 19:15:00.767546   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:15:00.767555   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:15:00.767566   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:15:00.829818   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:15:00.829837   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:15:00.842792   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:15:00.842810   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:15:00.919161   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:15:00.909398   11742 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:00.910484   11742 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:00.912564   11742 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:00.913381   11742 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:00.915298   11742 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:15:00.909398   11742 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:00.910484   11742 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:00.912564   11742 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:00.913381   11742 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:00.915298   11742 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:15:00.919174   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:15:00.919193   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:15:00.985798   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:15:00.985819   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:15:03.521258   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:15:03.531745   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:15:03.531810   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:15:03.556245   54807 cri.go:89] found id: ""
	I1202 19:15:03.556258   54807 logs.go:282] 0 containers: []
	W1202 19:15:03.556265   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:15:03.556271   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:15:03.556355   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:15:03.580774   54807 cri.go:89] found id: ""
	I1202 19:15:03.580787   54807 logs.go:282] 0 containers: []
	W1202 19:15:03.580794   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:15:03.580799   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:15:03.580857   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:15:03.606247   54807 cri.go:89] found id: ""
	I1202 19:15:03.606261   54807 logs.go:282] 0 containers: []
	W1202 19:15:03.606269   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:15:03.606274   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:15:03.606335   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:15:03.631169   54807 cri.go:89] found id: ""
	I1202 19:15:03.631182   54807 logs.go:282] 0 containers: []
	W1202 19:15:03.631189   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:15:03.631195   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:15:03.631252   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:15:03.657089   54807 cri.go:89] found id: ""
	I1202 19:15:03.657111   54807 logs.go:282] 0 containers: []
	W1202 19:15:03.657118   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:15:03.657124   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:15:03.657183   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:15:03.699997   54807 cri.go:89] found id: ""
	I1202 19:15:03.700010   54807 logs.go:282] 0 containers: []
	W1202 19:15:03.700017   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:15:03.700023   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:15:03.700081   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:15:03.725717   54807 cri.go:89] found id: ""
	I1202 19:15:03.725731   54807 logs.go:282] 0 containers: []
	W1202 19:15:03.725738   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:15:03.725746   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:15:03.725755   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:15:03.793907   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:15:03.793928   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:15:03.822178   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:15:03.822199   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:15:03.881429   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:15:03.881453   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:15:03.892554   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:15:03.892569   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:15:03.960792   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:15:03.952836   11862 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:03.953779   11862 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:03.955515   11862 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:03.955829   11862 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:03.957393   11862 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:15:03.952836   11862 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:03.953779   11862 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:03.955515   11862 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:03.955829   11862 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:03.957393   11862 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:15:06.461036   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:15:06.471459   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:15:06.471519   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:15:06.500164   54807 cri.go:89] found id: ""
	I1202 19:15:06.500178   54807 logs.go:282] 0 containers: []
	W1202 19:15:06.500184   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:15:06.500190   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:15:06.500253   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:15:06.526532   54807 cri.go:89] found id: ""
	I1202 19:15:06.526545   54807 logs.go:282] 0 containers: []
	W1202 19:15:06.526552   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:15:06.526558   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:15:06.526616   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:15:06.551534   54807 cri.go:89] found id: ""
	I1202 19:15:06.551553   54807 logs.go:282] 0 containers: []
	W1202 19:15:06.551560   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:15:06.551566   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:15:06.551628   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:15:06.577486   54807 cri.go:89] found id: ""
	I1202 19:15:06.577500   54807 logs.go:282] 0 containers: []
	W1202 19:15:06.577506   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:15:06.577512   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:15:06.577570   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:15:06.607506   54807 cri.go:89] found id: ""
	I1202 19:15:06.607520   54807 logs.go:282] 0 containers: []
	W1202 19:15:06.607529   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:15:06.607535   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:15:06.607663   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:15:06.632779   54807 cri.go:89] found id: ""
	I1202 19:15:06.632792   54807 logs.go:282] 0 containers: []
	W1202 19:15:06.632799   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:15:06.632805   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:15:06.632866   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:15:06.656916   54807 cri.go:89] found id: ""
	I1202 19:15:06.656928   54807 logs.go:282] 0 containers: []
	W1202 19:15:06.656936   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:15:06.656943   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:15:06.656953   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:15:06.721178   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:15:06.721197   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:15:06.733421   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:15:06.733437   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:15:06.806706   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:15:06.798437   11957 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:06.799136   11957 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:06.800799   11957 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:06.801507   11957 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:06.803118   11957 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:15:06.798437   11957 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:06.799136   11957 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:06.800799   11957 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:06.801507   11957 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:06.803118   11957 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:15:06.806717   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:15:06.806728   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:15:06.870452   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:15:06.870471   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:15:09.403297   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:15:09.414259   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:15:09.414319   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:15:09.442090   54807 cri.go:89] found id: ""
	I1202 19:15:09.442103   54807 logs.go:282] 0 containers: []
	W1202 19:15:09.442110   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:15:09.442115   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:15:09.442175   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:15:09.471784   54807 cri.go:89] found id: ""
	I1202 19:15:09.471797   54807 logs.go:282] 0 containers: []
	W1202 19:15:09.471804   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:15:09.471809   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:15:09.471887   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:15:09.496688   54807 cri.go:89] found id: ""
	I1202 19:15:09.496701   54807 logs.go:282] 0 containers: []
	W1202 19:15:09.496708   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:15:09.496714   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:15:09.496773   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:15:09.522932   54807 cri.go:89] found id: ""
	I1202 19:15:09.522946   54807 logs.go:282] 0 containers: []
	W1202 19:15:09.522952   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:15:09.522957   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:15:09.523018   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:15:09.550254   54807 cri.go:89] found id: ""
	I1202 19:15:09.550268   54807 logs.go:282] 0 containers: []
	W1202 19:15:09.550275   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:15:09.550280   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:15:09.550341   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:15:09.578955   54807 cri.go:89] found id: ""
	I1202 19:15:09.578968   54807 logs.go:282] 0 containers: []
	W1202 19:15:09.578975   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:15:09.578980   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:15:09.579041   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:15:09.603797   54807 cri.go:89] found id: ""
	I1202 19:15:09.603812   54807 logs.go:282] 0 containers: []
	W1202 19:15:09.603819   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:15:09.603827   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:15:09.603837   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:15:09.660195   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:15:09.660215   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:15:09.671581   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:15:09.671596   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:15:09.755982   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:15:09.748446   12065 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:09.749204   12065 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:09.750900   12065 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:09.751203   12065 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:09.752674   12065 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:15:09.748446   12065 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:09.749204   12065 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:09.750900   12065 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:09.751203   12065 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:09.752674   12065 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:15:09.755993   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:15:09.756013   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:15:09.820958   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:15:09.820977   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:15:12.349982   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:15:12.359890   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:15:12.359953   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:15:12.387716   54807 cri.go:89] found id: ""
	I1202 19:15:12.387729   54807 logs.go:282] 0 containers: []
	W1202 19:15:12.387736   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:15:12.387741   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:15:12.387802   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:15:12.413168   54807 cri.go:89] found id: ""
	I1202 19:15:12.413182   54807 logs.go:282] 0 containers: []
	W1202 19:15:12.413188   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:15:12.413194   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:15:12.413262   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:15:12.441234   54807 cri.go:89] found id: ""
	I1202 19:15:12.441247   54807 logs.go:282] 0 containers: []
	W1202 19:15:12.441253   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:15:12.441262   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:15:12.441321   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:15:12.465660   54807 cri.go:89] found id: ""
	I1202 19:15:12.465673   54807 logs.go:282] 0 containers: []
	W1202 19:15:12.465680   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:15:12.465689   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:15:12.465747   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:15:12.489519   54807 cri.go:89] found id: ""
	I1202 19:15:12.489532   54807 logs.go:282] 0 containers: []
	W1202 19:15:12.489540   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:15:12.489545   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:15:12.489605   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:15:12.514756   54807 cri.go:89] found id: ""
	I1202 19:15:12.514770   54807 logs.go:282] 0 containers: []
	W1202 19:15:12.514777   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:15:12.514782   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:15:12.514843   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:15:12.538845   54807 cri.go:89] found id: ""
	I1202 19:15:12.538858   54807 logs.go:282] 0 containers: []
	W1202 19:15:12.538865   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:15:12.538872   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:15:12.538884   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:15:12.549453   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:15:12.549477   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:15:12.616294   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:15:12.608411   12164 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:12.609095   12164 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:12.610814   12164 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:12.611359   12164 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:12.612794   12164 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:15:12.608411   12164 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:12.609095   12164 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:12.610814   12164 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:12.611359   12164 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:12.612794   12164 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:15:12.616304   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:15:12.616315   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:15:12.679579   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:15:12.679598   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:15:12.712483   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:15:12.712499   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:15:15.277003   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:15:15.287413   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:15:15.287496   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:15:15.313100   54807 cri.go:89] found id: ""
	I1202 19:15:15.313113   54807 logs.go:282] 0 containers: []
	W1202 19:15:15.313120   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:15:15.313135   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:15:15.313194   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:15:15.339367   54807 cri.go:89] found id: ""
	I1202 19:15:15.339381   54807 logs.go:282] 0 containers: []
	W1202 19:15:15.339387   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:15:15.339393   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:15:15.339463   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:15:15.364247   54807 cri.go:89] found id: ""
	I1202 19:15:15.364270   54807 logs.go:282] 0 containers: []
	W1202 19:15:15.364277   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:15:15.364283   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:15:15.364393   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:15:15.389379   54807 cri.go:89] found id: ""
	I1202 19:15:15.389393   54807 logs.go:282] 0 containers: []
	W1202 19:15:15.389401   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:15:15.389412   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:15:15.389472   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:15:15.414364   54807 cri.go:89] found id: ""
	I1202 19:15:15.414378   54807 logs.go:282] 0 containers: []
	W1202 19:15:15.414386   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:15:15.414391   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:15:15.414455   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:15:15.438995   54807 cri.go:89] found id: ""
	I1202 19:15:15.439009   54807 logs.go:282] 0 containers: []
	W1202 19:15:15.439024   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:15:15.439030   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:15:15.439097   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:15:15.467973   54807 cri.go:89] found id: ""
	I1202 19:15:15.467986   54807 logs.go:282] 0 containers: []
	W1202 19:15:15.467993   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:15:15.468001   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:15:15.468010   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:15:15.534212   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:15:15.526543   12263 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:15.527142   12263 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:15.528724   12263 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:15.529277   12263 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:15.530859   12263 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:15:15.526543   12263 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:15.527142   12263 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:15.528724   12263 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:15.529277   12263 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:15.530859   12263 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:15:15.534222   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:15:15.534233   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:15:15.602898   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:15:15.602917   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:15:15.634225   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:15:15.634242   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:15:15.693229   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:15:15.693247   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:15:18.205585   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:15:18.217019   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:15:18.217080   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:15:18.243139   54807 cri.go:89] found id: ""
	I1202 19:15:18.243153   54807 logs.go:282] 0 containers: []
	W1202 19:15:18.243160   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:15:18.243176   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:15:18.243234   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:15:18.266826   54807 cri.go:89] found id: ""
	I1202 19:15:18.266839   54807 logs.go:282] 0 containers: []
	W1202 19:15:18.266846   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:15:18.266851   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:15:18.266911   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:15:18.291760   54807 cri.go:89] found id: ""
	I1202 19:15:18.291773   54807 logs.go:282] 0 containers: []
	W1202 19:15:18.291781   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:15:18.291795   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:15:18.291853   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:15:18.315881   54807 cri.go:89] found id: ""
	I1202 19:15:18.315895   54807 logs.go:282] 0 containers: []
	W1202 19:15:18.315902   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:15:18.315907   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:15:18.315963   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:15:18.354620   54807 cri.go:89] found id: ""
	I1202 19:15:18.354633   54807 logs.go:282] 0 containers: []
	W1202 19:15:18.354640   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:15:18.354649   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:15:18.354708   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:15:18.378919   54807 cri.go:89] found id: ""
	I1202 19:15:18.378932   54807 logs.go:282] 0 containers: []
	W1202 19:15:18.378939   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:15:18.378945   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:15:18.379003   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:15:18.403461   54807 cri.go:89] found id: ""
	I1202 19:15:18.403474   54807 logs.go:282] 0 containers: []
	W1202 19:15:18.403482   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:15:18.403489   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:15:18.403499   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:15:18.460043   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:15:18.460062   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:15:18.471326   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:15:18.471343   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:15:18.533325   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:15:18.525005   12371 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:18.525603   12371 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:18.527360   12371 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:18.527971   12371 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:18.529742   12371 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:15:18.525005   12371 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:18.525603   12371 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:18.527360   12371 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:18.527971   12371 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:18.529742   12371 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:15:18.533335   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:15:18.533346   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:15:18.595843   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:15:18.595862   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:15:21.128472   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:15:21.138623   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:15:21.138683   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:15:21.163008   54807 cri.go:89] found id: ""
	I1202 19:15:21.163021   54807 logs.go:282] 0 containers: []
	W1202 19:15:21.163028   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:15:21.163039   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:15:21.163096   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:15:21.186917   54807 cri.go:89] found id: ""
	I1202 19:15:21.186930   54807 logs.go:282] 0 containers: []
	W1202 19:15:21.186937   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:15:21.186942   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:15:21.187000   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:15:21.212853   54807 cri.go:89] found id: ""
	I1202 19:15:21.212866   54807 logs.go:282] 0 containers: []
	W1202 19:15:21.212873   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:15:21.212878   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:15:21.212937   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:15:21.240682   54807 cri.go:89] found id: ""
	I1202 19:15:21.240695   54807 logs.go:282] 0 containers: []
	W1202 19:15:21.240703   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:15:21.240708   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:15:21.240765   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:15:21.264693   54807 cri.go:89] found id: ""
	I1202 19:15:21.264706   54807 logs.go:282] 0 containers: []
	W1202 19:15:21.264713   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:15:21.264718   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:15:21.264778   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:15:21.288193   54807 cri.go:89] found id: ""
	I1202 19:15:21.288207   54807 logs.go:282] 0 containers: []
	W1202 19:15:21.288214   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:15:21.288219   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:15:21.288278   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:15:21.313950   54807 cri.go:89] found id: ""
	I1202 19:15:21.313964   54807 logs.go:282] 0 containers: []
	W1202 19:15:21.313971   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:15:21.313979   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:15:21.313990   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:15:21.324612   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:15:21.324626   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:15:21.388157   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:15:21.380313   12478 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:21.381141   12478 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:21.382761   12478 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:21.383081   12478 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:21.384783   12478 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:15:21.380313   12478 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:21.381141   12478 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:21.382761   12478 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:21.383081   12478 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:21.384783   12478 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:15:21.388177   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:15:21.388188   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:15:21.451835   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:15:21.451853   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:15:21.480172   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:15:21.480187   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:15:24.037107   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:15:24.047291   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:15:24.047362   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:15:24.072397   54807 cri.go:89] found id: ""
	I1202 19:15:24.072411   54807 logs.go:282] 0 containers: []
	W1202 19:15:24.072418   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:15:24.072424   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:15:24.072486   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:15:24.097793   54807 cri.go:89] found id: ""
	I1202 19:15:24.097807   54807 logs.go:282] 0 containers: []
	W1202 19:15:24.097814   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:15:24.097819   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:15:24.097879   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:15:24.122934   54807 cri.go:89] found id: ""
	I1202 19:15:24.122947   54807 logs.go:282] 0 containers: []
	W1202 19:15:24.122954   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:15:24.122960   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:15:24.123020   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:15:24.147849   54807 cri.go:89] found id: ""
	I1202 19:15:24.147863   54807 logs.go:282] 0 containers: []
	W1202 19:15:24.147869   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:15:24.147875   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:15:24.147935   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:15:24.172919   54807 cri.go:89] found id: ""
	I1202 19:15:24.172932   54807 logs.go:282] 0 containers: []
	W1202 19:15:24.172939   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:15:24.172944   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:15:24.173004   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:15:24.197266   54807 cri.go:89] found id: ""
	I1202 19:15:24.197280   54807 logs.go:282] 0 containers: []
	W1202 19:15:24.197287   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:15:24.197293   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:15:24.197351   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:15:24.222541   54807 cri.go:89] found id: ""
	I1202 19:15:24.222555   54807 logs.go:282] 0 containers: []
	W1202 19:15:24.222562   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:15:24.222572   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:15:24.222582   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:15:24.278762   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:15:24.278784   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:15:24.289861   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:15:24.289877   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:15:24.353810   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:15:24.345889   12589 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:24.346412   12589 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:24.347992   12589 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:24.348533   12589 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:24.350299   12589 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:15:24.345889   12589 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:24.346412   12589 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:24.347992   12589 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:24.348533   12589 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:24.350299   12589 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:15:24.353831   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:15:24.353842   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:15:24.416010   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:15:24.416029   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:15:26.947462   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:15:26.958975   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:15:26.959033   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:15:26.992232   54807 cri.go:89] found id: ""
	I1202 19:15:26.992257   54807 logs.go:282] 0 containers: []
	W1202 19:15:26.992264   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:15:26.992270   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:15:26.992354   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:15:27.021036   54807 cri.go:89] found id: ""
	I1202 19:15:27.021049   54807 logs.go:282] 0 containers: []
	W1202 19:15:27.021056   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:15:27.021062   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:15:27.021119   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:15:27.052008   54807 cri.go:89] found id: ""
	I1202 19:15:27.052022   54807 logs.go:282] 0 containers: []
	W1202 19:15:27.052028   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:15:27.052034   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:15:27.052093   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:15:27.076184   54807 cri.go:89] found id: ""
	I1202 19:15:27.076197   54807 logs.go:282] 0 containers: []
	W1202 19:15:27.076204   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:15:27.076209   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:15:27.076266   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:15:27.100296   54807 cri.go:89] found id: ""
	I1202 19:15:27.100308   54807 logs.go:282] 0 containers: []
	W1202 19:15:27.100315   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:15:27.100355   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:15:27.100413   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:15:27.125762   54807 cri.go:89] found id: ""
	I1202 19:15:27.125776   54807 logs.go:282] 0 containers: []
	W1202 19:15:27.125783   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:15:27.125788   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:15:27.125851   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:15:27.150224   54807 cri.go:89] found id: ""
	I1202 19:15:27.150237   54807 logs.go:282] 0 containers: []
	W1202 19:15:27.150244   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:15:27.150252   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:15:27.150262   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:15:27.178321   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:15:27.178338   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:15:27.233465   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:15:27.233484   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:15:27.244423   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:15:27.244437   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:15:27.311220   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:15:27.303476   12710 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:27.303903   12710 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:27.305730   12710 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:27.306227   12710 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:27.307716   12710 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:15:27.303476   12710 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:27.303903   12710 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:27.305730   12710 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:27.306227   12710 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:27.307716   12710 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:15:27.311235   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:15:27.311246   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
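	The cycle above repeats every ~3 seconds while minikube waits for the apiserver: probe for a `kube-apiserver` process with `pgrep`, query `crictl` for each control-plane container, then collect kubelet/containerd journals, `dmesg`, and `kubectl describe nodes`. A minimal dry-run sketch of that loop, with the command strings taken from this log (it only prints the commands instead of executing them, since the real ones require root inside the minikube node; the function name `gather_cmds` is ours, not minikube's):

	```shell
	#!/bin/sh
	# Dry-run sketch: print the diagnostic commands minikube runs in each
	# log-gathering cycle seen above, without executing them.
	gather_cmds() {
	  # 1. Is an apiserver process alive at all?
	  echo 'sudo pgrep -xnf kube-apiserver.*minikube.*'
	  # 2. Any container (running or exited) for each control-plane component?
	  for c in kube-apiserver etcd coredns kube-scheduler \
	           kube-proxy kube-controller-manager kindnet; do
	    echo "sudo crictl ps -a --quiet --name=$c"
	  done
	  # 3. Recent service journals and kernel messages.
	  echo 'sudo journalctl -u kubelet -n 400'
	  echo 'sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400'
	  echo 'sudo journalctl -u containerd -n 400'
	}
	gather_cmds
	```

	In this run every `crictl ps` query returns an empty id list and `describe nodes` fails with "connection refused" on `localhost:8441`, i.e. no control-plane container ever came up, which is why the cycle keeps repeating.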
	I1202 19:15:29.874091   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:15:29.884341   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:15:29.884402   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:15:29.909943   54807 cri.go:89] found id: ""
	I1202 19:15:29.909962   54807 logs.go:282] 0 containers: []
	W1202 19:15:29.909970   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:15:29.909975   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:15:29.910035   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:15:29.947534   54807 cri.go:89] found id: ""
	I1202 19:15:29.947547   54807 logs.go:282] 0 containers: []
	W1202 19:15:29.947554   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:15:29.947559   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:15:29.947617   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:15:29.989319   54807 cri.go:89] found id: ""
	I1202 19:15:29.989335   54807 logs.go:282] 0 containers: []
	W1202 19:15:29.989343   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:15:29.989349   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:15:29.989414   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:15:30.038828   54807 cri.go:89] found id: ""
	I1202 19:15:30.038842   54807 logs.go:282] 0 containers: []
	W1202 19:15:30.038850   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:15:30.038856   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:15:30.038932   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:15:30.067416   54807 cri.go:89] found id: ""
	I1202 19:15:30.067432   54807 logs.go:282] 0 containers: []
	W1202 19:15:30.067440   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:15:30.067446   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:15:30.067509   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:15:30.094866   54807 cri.go:89] found id: ""
	I1202 19:15:30.094881   54807 logs.go:282] 0 containers: []
	W1202 19:15:30.094888   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:15:30.094896   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:15:30.094958   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:15:30.120930   54807 cri.go:89] found id: ""
	I1202 19:15:30.120959   54807 logs.go:282] 0 containers: []
	W1202 19:15:30.120968   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:15:30.120977   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:15:30.120988   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:15:30.177165   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:15:30.177186   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:15:30.188251   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:15:30.188267   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:15:30.255176   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:15:30.247015   12803 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:30.247757   12803 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:30.249382   12803 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:30.250104   12803 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:30.251678   12803 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:15:30.247015   12803 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:30.247757   12803 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:30.249382   12803 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:30.250104   12803 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:30.251678   12803 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:15:30.255194   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:15:30.255205   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:15:30.323165   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:15:30.323189   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:15:32.854201   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:15:32.864404   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:15:32.864467   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:15:32.890146   54807 cri.go:89] found id: ""
	I1202 19:15:32.890160   54807 logs.go:282] 0 containers: []
	W1202 19:15:32.890166   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:15:32.890172   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:15:32.890239   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:15:32.915189   54807 cri.go:89] found id: ""
	I1202 19:15:32.915202   54807 logs.go:282] 0 containers: []
	W1202 19:15:32.915210   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:15:32.915215   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:15:32.915286   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:15:32.952949   54807 cri.go:89] found id: ""
	I1202 19:15:32.952962   54807 logs.go:282] 0 containers: []
	W1202 19:15:32.952969   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:15:32.952975   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:15:32.953031   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:15:32.986345   54807 cri.go:89] found id: ""
	I1202 19:15:32.986359   54807 logs.go:282] 0 containers: []
	W1202 19:15:32.986366   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:15:32.986371   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:15:32.986435   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:15:33.010880   54807 cri.go:89] found id: ""
	I1202 19:15:33.010894   54807 logs.go:282] 0 containers: []
	W1202 19:15:33.010902   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:15:33.010908   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:15:33.010966   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:15:33.039327   54807 cri.go:89] found id: ""
	I1202 19:15:33.039341   54807 logs.go:282] 0 containers: []
	W1202 19:15:33.039348   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:15:33.039354   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:15:33.039412   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:15:33.064437   54807 cri.go:89] found id: ""
	I1202 19:15:33.064463   54807 logs.go:282] 0 containers: []
	W1202 19:15:33.064470   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:15:33.064478   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:15:33.064488   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:15:33.120755   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:15:33.120773   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:15:33.132552   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:15:33.132575   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:15:33.199378   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:15:33.191204   12907 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:33.191873   12907 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:33.193517   12907 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:33.194121   12907 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:33.195812   12907 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:15:33.191204   12907 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:33.191873   12907 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:33.193517   12907 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:33.194121   12907 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:33.195812   12907 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:15:33.199389   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:15:33.199401   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:15:33.266899   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:15:33.266918   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:15:35.796024   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:15:35.807086   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:15:35.807146   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:15:35.839365   54807 cri.go:89] found id: ""
	I1202 19:15:35.839378   54807 logs.go:282] 0 containers: []
	W1202 19:15:35.839394   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:15:35.839400   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:15:35.839469   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:15:35.872371   54807 cri.go:89] found id: ""
	I1202 19:15:35.872385   54807 logs.go:282] 0 containers: []
	W1202 19:15:35.872393   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:15:35.872398   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:15:35.872467   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:15:35.901242   54807 cri.go:89] found id: ""
	I1202 19:15:35.901255   54807 logs.go:282] 0 containers: []
	W1202 19:15:35.901262   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:15:35.901268   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:15:35.901326   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:15:35.936195   54807 cri.go:89] found id: ""
	I1202 19:15:35.936209   54807 logs.go:282] 0 containers: []
	W1202 19:15:35.936215   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:15:35.936221   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:15:35.936282   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:15:35.965129   54807 cri.go:89] found id: ""
	I1202 19:15:35.965145   54807 logs.go:282] 0 containers: []
	W1202 19:15:35.965153   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:15:35.965159   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:15:35.966675   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:15:35.998286   54807 cri.go:89] found id: ""
	I1202 19:15:35.998299   54807 logs.go:282] 0 containers: []
	W1202 19:15:35.998306   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:15:35.998311   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:15:35.998371   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:15:36.024787   54807 cri.go:89] found id: ""
	I1202 19:15:36.024800   54807 logs.go:282] 0 containers: []
	W1202 19:15:36.024812   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:15:36.024820   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:15:36.024829   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:15:36.081130   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:15:36.081146   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:15:36.092692   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:15:36.092714   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:15:36.154814   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:15:36.146918   13010 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:36.147704   13010 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:36.149430   13010 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:36.149884   13010 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:36.151372   13010 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:15:36.146918   13010 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:36.147704   13010 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:36.149430   13010 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:36.149884   13010 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:36.151372   13010 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:15:36.154824   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:15:36.154837   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:15:36.218034   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:15:36.218052   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:15:38.748085   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:15:38.758270   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:15:38.758328   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:15:38.786304   54807 cri.go:89] found id: ""
	I1202 19:15:38.786317   54807 logs.go:282] 0 containers: []
	W1202 19:15:38.786325   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:15:38.786330   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:15:38.786389   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:15:38.811113   54807 cri.go:89] found id: ""
	I1202 19:15:38.811126   54807 logs.go:282] 0 containers: []
	W1202 19:15:38.811134   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:15:38.811139   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:15:38.811223   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:15:38.836191   54807 cri.go:89] found id: ""
	I1202 19:15:38.836207   54807 logs.go:282] 0 containers: []
	W1202 19:15:38.836214   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:15:38.836219   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:15:38.836278   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:15:38.860383   54807 cri.go:89] found id: ""
	I1202 19:15:38.860396   54807 logs.go:282] 0 containers: []
	W1202 19:15:38.860403   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:15:38.860410   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:15:38.860469   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:15:38.887750   54807 cri.go:89] found id: ""
	I1202 19:15:38.887764   54807 logs.go:282] 0 containers: []
	W1202 19:15:38.887770   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:15:38.887775   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:15:38.887834   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:15:38.914103   54807 cri.go:89] found id: ""
	I1202 19:15:38.914116   54807 logs.go:282] 0 containers: []
	W1202 19:15:38.914123   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:15:38.914128   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:15:38.914184   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:15:38.950405   54807 cri.go:89] found id: ""
	I1202 19:15:38.950418   54807 logs.go:282] 0 containers: []
	W1202 19:15:38.950425   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:15:38.950433   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:15:38.950442   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:15:39.016206   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:15:39.016225   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:15:39.026699   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:15:39.026714   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:15:39.090183   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:15:39.082441   13115 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:39.083071   13115 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:39.084892   13115 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:39.085258   13115 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:39.086741   13115 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:15:39.082441   13115 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:39.083071   13115 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:39.084892   13115 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:39.085258   13115 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:39.086741   13115 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:15:39.090195   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:15:39.090206   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:15:39.151533   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:15:39.151551   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:15:41.681058   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:15:41.691353   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:15:41.691417   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:15:41.716684   54807 cri.go:89] found id: ""
	I1202 19:15:41.716697   54807 logs.go:282] 0 containers: []
	W1202 19:15:41.716704   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:15:41.716710   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:15:41.716768   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:15:41.742096   54807 cri.go:89] found id: ""
	I1202 19:15:41.742110   54807 logs.go:282] 0 containers: []
	W1202 19:15:41.742117   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:15:41.742122   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:15:41.742182   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:15:41.766652   54807 cri.go:89] found id: ""
	I1202 19:15:41.766665   54807 logs.go:282] 0 containers: []
	W1202 19:15:41.766672   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:15:41.766678   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:15:41.766741   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:15:41.791517   54807 cri.go:89] found id: ""
	I1202 19:15:41.791531   54807 logs.go:282] 0 containers: []
	W1202 19:15:41.791538   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:15:41.791544   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:15:41.791600   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:15:41.817700   54807 cri.go:89] found id: ""
	I1202 19:15:41.817713   54807 logs.go:282] 0 containers: []
	W1202 19:15:41.817720   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:15:41.817725   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:15:41.817786   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:15:41.846078   54807 cri.go:89] found id: ""
	I1202 19:15:41.846092   54807 logs.go:282] 0 containers: []
	W1202 19:15:41.846099   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:15:41.846104   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:15:41.846161   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:15:41.874235   54807 cri.go:89] found id: ""
	I1202 19:15:41.874249   54807 logs.go:282] 0 containers: []
	W1202 19:15:41.874258   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:15:41.874268   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:15:41.874278   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:15:41.942286   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:15:41.942307   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:15:41.989723   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:15:41.989740   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:15:42.047707   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:15:42.047728   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:15:42.061053   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:15:42.061073   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:15:42.138885   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:15:42.129369   13235 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:42.130125   13235 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:42.131195   13235 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:42.132151   13235 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:42.133038   13235 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:15:42.129369   13235 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:42.130125   13235 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:42.131195   13235 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:42.132151   13235 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:42.133038   13235 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
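The block above shows minikube's per-component probe: for each control-plane component it runs `crictl ps -a --quiet --name=<component>` and, when the output is empty, logs `No container was found matching "<component>"`. A minimal sketch of that decision (hypothetical helper name; the real loop lives in minikube's `logs.go`/`cri.go`):

```shell
# check_component: mimic minikube's empty-output check on crictl ps.
# $1: component name, $2: captured output of
#     `sudo crictl ps -a --quiet --name=$1` (empty string = no container).
check_component() {
  if [ -z "$2" ]; then
    # matches the W-level line in the log above
    echo "No container was found matching \"$1\""
  else
    echo "found id: $2"
  fi
}
```

For example, `check_component kube-apiserver ""` reproduces the warning seen for every component here, which is why the subsequent `kubectl describe nodes` fails with connection refused on localhost:8441 — there is no apiserver container to serve it.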
	I1202 19:15:44.639103   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:15:44.648984   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:15:44.649044   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:15:44.673076   54807 cri.go:89] found id: ""
	I1202 19:15:44.673091   54807 logs.go:282] 0 containers: []
	W1202 19:15:44.673098   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:15:44.673105   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:15:44.673162   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:15:44.696488   54807 cri.go:89] found id: ""
	I1202 19:15:44.696501   54807 logs.go:282] 0 containers: []
	W1202 19:15:44.696507   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:15:44.696512   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:15:44.696568   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:15:44.722164   54807 cri.go:89] found id: ""
	I1202 19:15:44.722177   54807 logs.go:282] 0 containers: []
	W1202 19:15:44.722184   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:15:44.722190   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:15:44.722254   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:15:44.745410   54807 cri.go:89] found id: ""
	I1202 19:15:44.745424   54807 logs.go:282] 0 containers: []
	W1202 19:15:44.745431   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:15:44.745437   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:15:44.745494   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:15:44.769317   54807 cri.go:89] found id: ""
	I1202 19:15:44.769330   54807 logs.go:282] 0 containers: []
	W1202 19:15:44.769337   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:15:44.769342   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:15:44.769404   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:15:44.794282   54807 cri.go:89] found id: ""
	I1202 19:15:44.794295   54807 logs.go:282] 0 containers: []
	W1202 19:15:44.794302   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:15:44.794308   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:15:44.794369   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:15:44.818676   54807 cri.go:89] found id: ""
	I1202 19:15:44.818689   54807 logs.go:282] 0 containers: []
	W1202 19:15:44.818696   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:15:44.818703   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:15:44.818734   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:15:44.829491   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:15:44.829506   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:15:44.892401   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:15:44.884881   13319 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:44.885617   13319 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:44.887285   13319 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:44.887590   13319 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:44.889113   13319 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:15:44.884881   13319 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:44.885617   13319 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:44.887285   13319 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:44.887590   13319 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:44.889113   13319 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:15:44.892427   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:15:44.892438   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:15:44.961436   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:15:44.961457   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:15:45.004301   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:15:45.004340   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:15:47.597359   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:15:47.607380   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:15:47.607436   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:15:47.632361   54807 cri.go:89] found id: ""
	I1202 19:15:47.632375   54807 logs.go:282] 0 containers: []
	W1202 19:15:47.632382   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:15:47.632387   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:15:47.632443   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:15:47.657478   54807 cri.go:89] found id: ""
	I1202 19:15:47.657491   54807 logs.go:282] 0 containers: []
	W1202 19:15:47.657498   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:15:47.657504   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:15:47.657565   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:15:47.681973   54807 cri.go:89] found id: ""
	I1202 19:15:47.681987   54807 logs.go:282] 0 containers: []
	W1202 19:15:47.681994   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:15:47.681999   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:15:47.682054   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:15:47.705968   54807 cri.go:89] found id: ""
	I1202 19:15:47.705982   54807 logs.go:282] 0 containers: []
	W1202 19:15:47.705988   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:15:47.705994   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:15:47.706051   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:15:47.730910   54807 cri.go:89] found id: ""
	I1202 19:15:47.730923   54807 logs.go:282] 0 containers: []
	W1202 19:15:47.730930   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:15:47.730935   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:15:47.730992   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:15:47.757739   54807 cri.go:89] found id: ""
	I1202 19:15:47.757752   54807 logs.go:282] 0 containers: []
	W1202 19:15:47.757759   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:15:47.757764   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:15:47.757820   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:15:47.782566   54807 cri.go:89] found id: ""
	I1202 19:15:47.782579   54807 logs.go:282] 0 containers: []
	W1202 19:15:47.782586   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:15:47.782594   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:15:47.782605   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:15:47.845974   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:15:47.837753   13419 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:47.838402   13419 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:47.840192   13419 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:47.840777   13419 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:47.842546   13419 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:15:47.837753   13419 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:47.838402   13419 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:47.840192   13419 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:47.840777   13419 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:47.842546   13419 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:15:47.845983   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:15:47.845994   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:15:47.913035   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:15:47.913054   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:15:47.952076   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:15:47.952091   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:15:48.023577   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:15:48.023596   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:15:50.534902   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:15:50.544843   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:15:50.544904   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:15:50.573435   54807 cri.go:89] found id: ""
	I1202 19:15:50.573449   54807 logs.go:282] 0 containers: []
	W1202 19:15:50.573456   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:15:50.573462   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:15:50.573524   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:15:50.598029   54807 cri.go:89] found id: ""
	I1202 19:15:50.598043   54807 logs.go:282] 0 containers: []
	W1202 19:15:50.598051   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:15:50.598056   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:15:50.598115   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:15:50.623452   54807 cri.go:89] found id: ""
	I1202 19:15:50.623465   54807 logs.go:282] 0 containers: []
	W1202 19:15:50.623472   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:15:50.623478   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:15:50.623536   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:15:50.648357   54807 cri.go:89] found id: ""
	I1202 19:15:50.648371   54807 logs.go:282] 0 containers: []
	W1202 19:15:50.648378   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:15:50.648383   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:15:50.648441   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:15:50.672042   54807 cri.go:89] found id: ""
	I1202 19:15:50.672056   54807 logs.go:282] 0 containers: []
	W1202 19:15:50.672063   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:15:50.672068   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:15:50.672125   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:15:50.697434   54807 cri.go:89] found id: ""
	I1202 19:15:50.697448   54807 logs.go:282] 0 containers: []
	W1202 19:15:50.697455   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:15:50.697461   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:15:50.697525   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:15:50.728291   54807 cri.go:89] found id: ""
	I1202 19:15:50.728305   54807 logs.go:282] 0 containers: []
	W1202 19:15:50.728312   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:15:50.728340   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:15:50.728351   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:15:50.790193   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:15:50.781764   13524 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:50.782494   13524 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:50.784122   13524 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:50.784535   13524 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:50.786186   13524 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:15:50.781764   13524 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:50.782494   13524 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:50.784122   13524 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:50.784535   13524 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:50.786186   13524 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:15:50.790203   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:15:50.790214   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:15:50.855933   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:15:50.855951   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:15:50.884682   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:15:50.884698   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:15:50.949404   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:15:50.949423   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:15:53.461440   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:15:53.471831   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:15:53.471906   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:15:53.496591   54807 cri.go:89] found id: ""
	I1202 19:15:53.496604   54807 logs.go:282] 0 containers: []
	W1202 19:15:53.496611   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:15:53.496617   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:15:53.496674   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:15:53.521087   54807 cri.go:89] found id: ""
	I1202 19:15:53.521103   54807 logs.go:282] 0 containers: []
	W1202 19:15:53.521111   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:15:53.521116   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:15:53.521174   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:15:53.545148   54807 cri.go:89] found id: ""
	I1202 19:15:53.545161   54807 logs.go:282] 0 containers: []
	W1202 19:15:53.545168   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:15:53.545173   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:15:53.545231   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:15:53.570884   54807 cri.go:89] found id: ""
	I1202 19:15:53.570898   54807 logs.go:282] 0 containers: []
	W1202 19:15:53.570904   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:15:53.570910   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:15:53.570972   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:15:53.597220   54807 cri.go:89] found id: ""
	I1202 19:15:53.597234   54807 logs.go:282] 0 containers: []
	W1202 19:15:53.597241   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:15:53.597247   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:15:53.597326   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:15:53.626817   54807 cri.go:89] found id: ""
	I1202 19:15:53.626830   54807 logs.go:282] 0 containers: []
	W1202 19:15:53.626837   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:15:53.626843   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:15:53.626901   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:15:53.656721   54807 cri.go:89] found id: ""
	I1202 19:15:53.656734   54807 logs.go:282] 0 containers: []
	W1202 19:15:53.656741   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:15:53.656750   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:15:53.656762   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:15:53.721841   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:15:53.712824   13626 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:53.713681   13626 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:53.715369   13626 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:53.715912   13626 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:53.717503   13626 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:15:53.712824   13626 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:53.713681   13626 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:53.715369   13626 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:53.715912   13626 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:53.717503   13626 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:15:53.721850   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:15:53.721862   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:15:53.785783   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:15:53.785801   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:15:53.815658   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:15:53.815673   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:15:53.873221   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:15:53.873238   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
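The timestamps above (19:15:44, :47, :50, :53, …) show minikube re-running the same `pgrep -xnf kube-apiserver.*minikube.*` probe roughly every 3 seconds until the apiserver appears or the test times out. A sketch of that cadence, assuming a generic probe command (the `sleep` is elided so the sketch runs instantly; the real loop waits between attempts):

```shell
# wait_for_apiserver: retry a probe command up to N times,
# mirroring the ~3s polling cadence visible in the log timestamps.
# $1: max attempts; remaining args: the probe command to run.
wait_for_apiserver() {
  n=$1; shift
  i=1
  while [ "$i" -le "$n" ]; do
    if "$@"; then
      echo "apiserver up after $i attempt(s)"
      return 0
    fi
    # real loop: sleep ~3s here before the next pgrep/crictl round
    i=$((i + 1))
  done
  echo "apiserver still down after $n attempts"
  return 1
}
```

In this run every probe fails (no kube-apiserver container ever starts), so the loop exhausts its attempts — the log simply records each failed round in full.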
	I1202 19:15:56.384447   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:15:56.394843   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:15:56.394909   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:15:56.425129   54807 cri.go:89] found id: ""
	I1202 19:15:56.425142   54807 logs.go:282] 0 containers: []
	W1202 19:15:56.425149   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:15:56.425154   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:15:56.425212   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:15:56.451236   54807 cri.go:89] found id: ""
	I1202 19:15:56.451250   54807 logs.go:282] 0 containers: []
	W1202 19:15:56.451257   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:15:56.451263   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:15:56.451327   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:15:56.476585   54807 cri.go:89] found id: ""
	I1202 19:15:56.476599   54807 logs.go:282] 0 containers: []
	W1202 19:15:56.476606   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:15:56.476611   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:15:56.476669   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:15:56.501814   54807 cri.go:89] found id: ""
	I1202 19:15:56.501828   54807 logs.go:282] 0 containers: []
	W1202 19:15:56.501834   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:15:56.501840   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:15:56.501900   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:15:56.530866   54807 cri.go:89] found id: ""
	I1202 19:15:56.530879   54807 logs.go:282] 0 containers: []
	W1202 19:15:56.530886   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:15:56.530891   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:15:56.530959   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:15:56.555014   54807 cri.go:89] found id: ""
	I1202 19:15:56.555029   54807 logs.go:282] 0 containers: []
	W1202 19:15:56.555036   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:15:56.555042   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:15:56.555102   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:15:56.582644   54807 cri.go:89] found id: ""
	I1202 19:15:56.582657   54807 logs.go:282] 0 containers: []
	W1202 19:15:56.582664   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:15:56.582672   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:15:56.582684   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:15:56.637937   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:15:56.637955   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:15:56.648656   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:15:56.648672   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:15:56.716929   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:15:56.708385   13741 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:56.708981   13741 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:56.710802   13741 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:56.711377   13741 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:56.712990   13741 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:15:56.708385   13741 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:56.708981   13741 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:56.710802   13741 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:56.711377   13741 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:56.712990   13741 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:15:56.716939   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:15:56.716950   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:15:56.783854   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:15:56.783880   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:15:59.312498   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:15:59.322671   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:15:59.322730   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:15:59.346425   54807 cri.go:89] found id: ""
	I1202 19:15:59.346439   54807 logs.go:282] 0 containers: []
	W1202 19:15:59.346446   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:15:59.346452   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:15:59.346515   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:15:59.371199   54807 cri.go:89] found id: ""
	I1202 19:15:59.371212   54807 logs.go:282] 0 containers: []
	W1202 19:15:59.371219   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:15:59.371224   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:15:59.371286   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:15:59.398444   54807 cri.go:89] found id: ""
	I1202 19:15:59.398458   54807 logs.go:282] 0 containers: []
	W1202 19:15:59.398465   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:15:59.398470   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:15:59.398528   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:15:59.423109   54807 cri.go:89] found id: ""
	I1202 19:15:59.423122   54807 logs.go:282] 0 containers: []
	W1202 19:15:59.423129   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:15:59.423135   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:15:59.423193   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:15:59.448440   54807 cri.go:89] found id: ""
	I1202 19:15:59.448454   54807 logs.go:282] 0 containers: []
	W1202 19:15:59.448461   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:15:59.448469   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:15:59.448539   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:15:59.472288   54807 cri.go:89] found id: ""
	I1202 19:15:59.472302   54807 logs.go:282] 0 containers: []
	W1202 19:15:59.472309   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:15:59.472315   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:15:59.472396   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:15:59.501959   54807 cri.go:89] found id: ""
	I1202 19:15:59.501973   54807 logs.go:282] 0 containers: []
	W1202 19:15:59.501980   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:15:59.501987   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:15:59.501999   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:15:59.562783   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:15:59.555099   13842 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:59.555718   13842 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:59.557259   13842 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:59.557814   13842 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:59.559465   13842 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:15:59.555099   13842 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:59.555718   13842 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:59.557259   13842 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:59.557814   13842 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:59.559465   13842 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:15:59.562800   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:15:59.562811   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:15:59.626612   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:15:59.626631   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:15:59.655068   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:15:59.655083   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:15:59.713332   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:15:59.713350   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:16:02.224451   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:16:02.234704   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:16:02.234765   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:16:02.260861   54807 cri.go:89] found id: ""
	I1202 19:16:02.260875   54807 logs.go:282] 0 containers: []
	W1202 19:16:02.260882   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:16:02.260888   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:16:02.260951   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:16:02.286334   54807 cri.go:89] found id: ""
	I1202 19:16:02.286354   54807 logs.go:282] 0 containers: []
	W1202 19:16:02.286362   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:16:02.286367   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:16:02.286426   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:16:02.310961   54807 cri.go:89] found id: ""
	I1202 19:16:02.310975   54807 logs.go:282] 0 containers: []
	W1202 19:16:02.310982   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:16:02.310988   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:16:02.311050   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:16:02.339645   54807 cri.go:89] found id: ""
	I1202 19:16:02.339658   54807 logs.go:282] 0 containers: []
	W1202 19:16:02.339665   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:16:02.339670   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:16:02.339727   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:16:02.364456   54807 cri.go:89] found id: ""
	I1202 19:16:02.364471   54807 logs.go:282] 0 containers: []
	W1202 19:16:02.364478   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:16:02.364484   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:16:02.364547   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:16:02.394258   54807 cri.go:89] found id: ""
	I1202 19:16:02.394272   54807 logs.go:282] 0 containers: []
	W1202 19:16:02.394278   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:16:02.394284   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:16:02.394342   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:16:02.418723   54807 cri.go:89] found id: ""
	I1202 19:16:02.418737   54807 logs.go:282] 0 containers: []
	W1202 19:16:02.418744   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:16:02.418752   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:16:02.418762   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:16:02.482679   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:16:02.474517   13947 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:02.475378   13947 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:02.476933   13947 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:02.477486   13947 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:02.479002   13947 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:16:02.474517   13947 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:02.475378   13947 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:02.476933   13947 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:02.477486   13947 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:02.479002   13947 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:16:02.482690   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:16:02.482700   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:16:02.548276   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:16:02.548295   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:16:02.578369   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:16:02.578386   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:16:02.636563   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:16:02.636581   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:16:05.147857   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:16:05.158273   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:16:05.158332   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:16:05.198133   54807 cri.go:89] found id: ""
	I1202 19:16:05.198149   54807 logs.go:282] 0 containers: []
	W1202 19:16:05.198161   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:16:05.198167   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:16:05.198230   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:16:05.229481   54807 cri.go:89] found id: ""
	I1202 19:16:05.229494   54807 logs.go:282] 0 containers: []
	W1202 19:16:05.229508   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:16:05.229513   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:16:05.229573   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:16:05.255940   54807 cri.go:89] found id: ""
	I1202 19:16:05.255954   54807 logs.go:282] 0 containers: []
	W1202 19:16:05.255961   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:16:05.255967   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:16:05.256027   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:16:05.281978   54807 cri.go:89] found id: ""
	I1202 19:16:05.281991   54807 logs.go:282] 0 containers: []
	W1202 19:16:05.281998   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:16:05.282004   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:16:05.282063   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:16:05.310511   54807 cri.go:89] found id: ""
	I1202 19:16:05.310525   54807 logs.go:282] 0 containers: []
	W1202 19:16:05.310533   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:16:05.310539   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:16:05.310605   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:16:05.340114   54807 cri.go:89] found id: ""
	I1202 19:16:05.340127   54807 logs.go:282] 0 containers: []
	W1202 19:16:05.340135   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:16:05.340140   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:16:05.340198   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:16:05.366243   54807 cri.go:89] found id: ""
	I1202 19:16:05.366256   54807 logs.go:282] 0 containers: []
	W1202 19:16:05.366263   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:16:05.366271   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:16:05.366283   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:16:05.393993   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:16:05.394009   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:16:05.450279   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:16:05.450299   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:16:05.461585   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:16:05.461602   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:16:05.528601   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:16:05.520245   14070 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:05.521019   14070 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:05.522739   14070 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:05.523260   14070 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:05.524914   14070 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:16:05.520245   14070 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:05.521019   14070 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:05.522739   14070 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:05.523260   14070 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:05.524914   14070 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:16:05.528610   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:16:05.528621   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:16:08.097252   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:16:08.107731   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:16:08.107792   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:16:08.134215   54807 cri.go:89] found id: ""
	I1202 19:16:08.134240   54807 logs.go:282] 0 containers: []
	W1202 19:16:08.134248   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:16:08.134255   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:16:08.134327   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:16:08.160174   54807 cri.go:89] found id: ""
	I1202 19:16:08.160188   54807 logs.go:282] 0 containers: []
	W1202 19:16:08.160195   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:16:08.160200   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:16:08.160259   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:16:08.188835   54807 cri.go:89] found id: ""
	I1202 19:16:08.188849   54807 logs.go:282] 0 containers: []
	W1202 19:16:08.188856   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:16:08.188871   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:16:08.188930   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:16:08.222672   54807 cri.go:89] found id: ""
	I1202 19:16:08.222686   54807 logs.go:282] 0 containers: []
	W1202 19:16:08.222703   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:16:08.222710   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:16:08.222774   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:16:08.252685   54807 cri.go:89] found id: ""
	I1202 19:16:08.252699   54807 logs.go:282] 0 containers: []
	W1202 19:16:08.252705   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:16:08.252711   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:16:08.252767   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:16:08.281659   54807 cri.go:89] found id: ""
	I1202 19:16:08.281672   54807 logs.go:282] 0 containers: []
	W1202 19:16:08.281679   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:16:08.281685   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:16:08.281757   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:16:08.306909   54807 cri.go:89] found id: ""
	I1202 19:16:08.306922   54807 logs.go:282] 0 containers: []
	W1202 19:16:08.306929   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:16:08.306936   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:16:08.306947   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:16:08.363919   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:16:08.363938   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:16:08.375138   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:16:08.375154   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:16:08.443392   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:16:08.435786   14165 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:08.436517   14165 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:08.438269   14165 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:08.438769   14165 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:08.439921   14165 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:16:08.435786   14165 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:08.436517   14165 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:08.438269   14165 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:08.438769   14165 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:08.439921   14165 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:16:08.443414   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:16:08.443428   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:16:08.507474   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:16:08.507492   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:16:11.037665   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:16:11.050056   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:16:11.050130   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:16:11.076993   54807 cri.go:89] found id: ""
	I1202 19:16:11.077008   54807 logs.go:282] 0 containers: []
	W1202 19:16:11.077015   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:16:11.077021   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:16:11.077088   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:16:11.104370   54807 cri.go:89] found id: ""
	I1202 19:16:11.104384   54807 logs.go:282] 0 containers: []
	W1202 19:16:11.104393   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:16:11.104399   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:16:11.104463   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:16:11.132145   54807 cri.go:89] found id: ""
	I1202 19:16:11.132160   54807 logs.go:282] 0 containers: []
	W1202 19:16:11.132168   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:16:11.132174   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:16:11.132235   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:16:11.158847   54807 cri.go:89] found id: ""
	I1202 19:16:11.158861   54807 logs.go:282] 0 containers: []
	W1202 19:16:11.158868   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:16:11.158874   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:16:11.158934   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:16:11.198715   54807 cri.go:89] found id: ""
	I1202 19:16:11.198729   54807 logs.go:282] 0 containers: []
	W1202 19:16:11.198736   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:16:11.198741   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:16:11.198804   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:16:11.230867   54807 cri.go:89] found id: ""
	I1202 19:16:11.230886   54807 logs.go:282] 0 containers: []
	W1202 19:16:11.230893   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:16:11.230899   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:16:11.230957   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:16:11.259807   54807 cri.go:89] found id: ""
	I1202 19:16:11.259821   54807 logs.go:282] 0 containers: []
	W1202 19:16:11.259828   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:16:11.259836   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:16:11.259846   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:16:11.287151   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:16:11.287167   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:16:11.344009   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:16:11.344032   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:16:11.354412   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:16:11.354433   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:16:11.420896   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:16:11.412603   14280 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:11.413632   14280 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:11.415146   14280 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:11.415437   14280 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:11.416861   14280 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1202 19:16:11.420906   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:16:11.420917   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:16:13.984421   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:16:13.995238   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:16:13.995302   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:16:14.021325   54807 cri.go:89] found id: ""
	I1202 19:16:14.021338   54807 logs.go:282] 0 containers: []
	W1202 19:16:14.021345   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:16:14.021350   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:16:14.021407   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:16:14.047264   54807 cri.go:89] found id: ""
	I1202 19:16:14.047278   54807 logs.go:282] 0 containers: []
	W1202 19:16:14.047285   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:16:14.047291   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:16:14.047355   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:16:14.071231   54807 cri.go:89] found id: ""
	I1202 19:16:14.071245   54807 logs.go:282] 0 containers: []
	W1202 19:16:14.071252   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:16:14.071257   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:16:14.071315   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:16:14.096289   54807 cri.go:89] found id: ""
	I1202 19:16:14.096302   54807 logs.go:282] 0 containers: []
	W1202 19:16:14.096309   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:16:14.096315   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:16:14.096397   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:16:14.122522   54807 cri.go:89] found id: ""
	I1202 19:16:14.122535   54807 logs.go:282] 0 containers: []
	W1202 19:16:14.122542   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:16:14.122548   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:16:14.122608   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:16:14.151408   54807 cri.go:89] found id: ""
	I1202 19:16:14.151422   54807 logs.go:282] 0 containers: []
	W1202 19:16:14.151429   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:16:14.151435   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:16:14.151496   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:16:14.182327   54807 cri.go:89] found id: ""
	I1202 19:16:14.182340   54807 logs.go:282] 0 containers: []
	W1202 19:16:14.182347   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:16:14.182355   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:16:14.182365   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:16:14.246777   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:16:14.246796   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:16:14.262093   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:16:14.262108   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:16:14.326058   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:16:14.317581   14371 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:14.318458   14371 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:14.320176   14371 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:14.320802   14371 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:14.322292   14371 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1202 19:16:14.326068   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:16:14.326080   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:16:14.388559   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:16:14.388578   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:16:16.920108   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:16:16.930319   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:16:16.930382   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:16:16.955799   54807 cri.go:89] found id: ""
	I1202 19:16:16.955813   54807 logs.go:282] 0 containers: []
	W1202 19:16:16.955820   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:16:16.955825   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:16:16.955882   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:16:16.982139   54807 cri.go:89] found id: ""
	I1202 19:16:16.982153   54807 logs.go:282] 0 containers: []
	W1202 19:16:16.982160   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:16:16.982165   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:16:16.982223   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:16:17.007837   54807 cri.go:89] found id: ""
	I1202 19:16:17.007851   54807 logs.go:282] 0 containers: []
	W1202 19:16:17.007857   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:16:17.007863   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:16:17.007933   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:16:17.034216   54807 cri.go:89] found id: ""
	I1202 19:16:17.034229   54807 logs.go:282] 0 containers: []
	W1202 19:16:17.034236   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:16:17.034241   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:16:17.034298   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:16:17.063913   54807 cri.go:89] found id: ""
	I1202 19:16:17.063927   54807 logs.go:282] 0 containers: []
	W1202 19:16:17.063934   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:16:17.063939   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:16:17.063997   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:16:17.088826   54807 cri.go:89] found id: ""
	I1202 19:16:17.088840   54807 logs.go:282] 0 containers: []
	W1202 19:16:17.088847   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:16:17.088853   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:16:17.088913   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:16:17.114356   54807 cri.go:89] found id: ""
	I1202 19:16:17.114370   54807 logs.go:282] 0 containers: []
	W1202 19:16:17.114376   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:16:17.114384   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:16:17.114394   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:16:17.171571   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:16:17.171591   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:16:17.192662   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:16:17.192677   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:16:17.265860   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:16:17.257716   14475 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:17.258547   14475 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:17.260144   14475 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:17.260824   14475 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:17.262474   14475 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1202 19:16:17.265870   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:16:17.265883   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:16:17.329636   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:16:17.329654   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:16:19.857139   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:16:19.867414   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:16:19.867471   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:16:19.891736   54807 cri.go:89] found id: ""
	I1202 19:16:19.891750   54807 logs.go:282] 0 containers: []
	W1202 19:16:19.891757   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:16:19.891762   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:16:19.891819   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:16:19.916840   54807 cri.go:89] found id: ""
	I1202 19:16:19.916854   54807 logs.go:282] 0 containers: []
	W1202 19:16:19.916861   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:16:19.916881   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:16:19.916938   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:16:19.941623   54807 cri.go:89] found id: ""
	I1202 19:16:19.941636   54807 logs.go:282] 0 containers: []
	W1202 19:16:19.941643   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:16:19.941649   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:16:19.941706   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:16:19.973037   54807 cri.go:89] found id: ""
	I1202 19:16:19.973051   54807 logs.go:282] 0 containers: []
	W1202 19:16:19.973059   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:16:19.973065   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:16:19.973134   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:16:20.000748   54807 cri.go:89] found id: ""
	I1202 19:16:20.000765   54807 logs.go:282] 0 containers: []
	W1202 19:16:20.000773   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:16:20.000780   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:16:20.000851   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:16:20.025854   54807 cri.go:89] found id: ""
	I1202 19:16:20.025868   54807 logs.go:282] 0 containers: []
	W1202 19:16:20.025875   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:16:20.025881   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:16:20.025940   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:16:20.052281   54807 cri.go:89] found id: ""
	I1202 19:16:20.052296   54807 logs.go:282] 0 containers: []
	W1202 19:16:20.052304   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:16:20.052312   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:16:20.052346   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:16:20.120511   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:16:20.111945   14568 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:20.112719   14568 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:20.114519   14568 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:20.115208   14568 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:20.116979   14568 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1202 19:16:20.120542   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:16:20.120557   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:16:20.192068   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:16:20.192088   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:16:20.232059   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:16:20.232074   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:16:20.287505   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:16:20.287527   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:16:22.798885   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:16:22.808880   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:16:22.808947   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:16:22.838711   54807 cri.go:89] found id: ""
	I1202 19:16:22.838736   54807 logs.go:282] 0 containers: []
	W1202 19:16:22.838744   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:16:22.838750   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:16:22.838815   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:16:22.866166   54807 cri.go:89] found id: ""
	I1202 19:16:22.866180   54807 logs.go:282] 0 containers: []
	W1202 19:16:22.866187   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:16:22.866192   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:16:22.866250   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:16:22.890456   54807 cri.go:89] found id: ""
	I1202 19:16:22.890470   54807 logs.go:282] 0 containers: []
	W1202 19:16:22.890484   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:16:22.890490   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:16:22.890554   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:16:22.915548   54807 cri.go:89] found id: ""
	I1202 19:16:22.915562   54807 logs.go:282] 0 containers: []
	W1202 19:16:22.915578   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:16:22.915585   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:16:22.915643   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:16:22.940011   54807 cri.go:89] found id: ""
	I1202 19:16:22.940025   54807 logs.go:282] 0 containers: []
	W1202 19:16:22.940032   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:16:22.940037   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:16:22.940093   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:16:22.965647   54807 cri.go:89] found id: ""
	I1202 19:16:22.965660   54807 logs.go:282] 0 containers: []
	W1202 19:16:22.965670   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:16:22.965677   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:16:22.965744   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:16:22.994566   54807 cri.go:89] found id: ""
	I1202 19:16:22.994580   54807 logs.go:282] 0 containers: []
	W1202 19:16:22.994587   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:16:22.994595   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:16:22.994611   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:16:23.050953   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:16:23.050973   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:16:23.061610   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:16:23.061624   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:16:23.127525   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:16:23.119520   14680 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:23.120161   14680 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:23.121888   14680 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:23.122530   14680 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:23.124179   14680 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:16:23.119520   14680 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:23.120161   14680 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:23.121888   14680 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:23.122530   14680 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:23.124179   14680 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:16:23.127534   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:16:23.127546   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:16:23.194603   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:16:23.194639   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:16:25.725656   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:16:25.735521   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:16:25.735580   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:16:25.760626   54807 cri.go:89] found id: ""
	I1202 19:16:25.760640   54807 logs.go:282] 0 containers: []
	W1202 19:16:25.760647   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:16:25.760652   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:16:25.760711   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:16:25.786443   54807 cri.go:89] found id: ""
	I1202 19:16:25.786457   54807 logs.go:282] 0 containers: []
	W1202 19:16:25.786464   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:16:25.786470   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:16:25.786529   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:16:25.813975   54807 cri.go:89] found id: ""
	I1202 19:16:25.813989   54807 logs.go:282] 0 containers: []
	W1202 19:16:25.813996   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:16:25.814001   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:16:25.814059   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:16:25.839899   54807 cri.go:89] found id: ""
	I1202 19:16:25.839912   54807 logs.go:282] 0 containers: []
	W1202 19:16:25.839920   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:16:25.839925   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:16:25.839983   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:16:25.869299   54807 cri.go:89] found id: ""
	I1202 19:16:25.869312   54807 logs.go:282] 0 containers: []
	W1202 19:16:25.869319   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:16:25.869325   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:16:25.869384   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:16:25.894364   54807 cri.go:89] found id: ""
	I1202 19:16:25.894379   54807 logs.go:282] 0 containers: []
	W1202 19:16:25.894385   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:16:25.894391   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:16:25.894448   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:16:25.919717   54807 cri.go:89] found id: ""
	I1202 19:16:25.919733   54807 logs.go:282] 0 containers: []
	W1202 19:16:25.919741   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:16:25.919748   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:16:25.919759   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:16:25.988177   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:16:25.979290   14781 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:25.979818   14781 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:25.981491   14781 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:25.982030   14781 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:25.983829   14781 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:16:25.979290   14781 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:25.979818   14781 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:25.981491   14781 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:25.982030   14781 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:25.983829   14781 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:16:25.988188   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:16:25.988198   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:16:26.052787   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:16:26.052806   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:16:26.081027   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:16:26.081042   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:16:26.138061   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:16:26.138079   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:16:28.650000   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:16:28.660481   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:16:28.660541   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:16:28.685594   54807 cri.go:89] found id: ""
	I1202 19:16:28.685608   54807 logs.go:282] 0 containers: []
	W1202 19:16:28.685616   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:16:28.685621   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:16:28.685679   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:16:28.710399   54807 cri.go:89] found id: ""
	I1202 19:16:28.710412   54807 logs.go:282] 0 containers: []
	W1202 19:16:28.710419   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:16:28.710425   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:16:28.710481   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:16:28.735520   54807 cri.go:89] found id: ""
	I1202 19:16:28.735533   54807 logs.go:282] 0 containers: []
	W1202 19:16:28.735546   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:16:28.735551   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:16:28.735607   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:16:28.762423   54807 cri.go:89] found id: ""
	I1202 19:16:28.762436   54807 logs.go:282] 0 containers: []
	W1202 19:16:28.762443   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:16:28.762449   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:16:28.762515   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:16:28.791746   54807 cri.go:89] found id: ""
	I1202 19:16:28.791760   54807 logs.go:282] 0 containers: []
	W1202 19:16:28.791767   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:16:28.791772   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:16:28.791831   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:16:28.818359   54807 cri.go:89] found id: ""
	I1202 19:16:28.818372   54807 logs.go:282] 0 containers: []
	W1202 19:16:28.818379   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:16:28.818386   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:16:28.818443   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:16:28.846465   54807 cri.go:89] found id: ""
	I1202 19:16:28.846479   54807 logs.go:282] 0 containers: []
	W1202 19:16:28.846486   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:16:28.846494   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:16:28.846503   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:16:28.903412   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:16:28.903430   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:16:28.914210   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:16:28.914267   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:16:28.978428   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:16:28.970543   14894 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:28.971044   14894 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:28.972803   14894 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:28.973286   14894 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:28.974839   14894 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:16:28.970543   14894 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:28.971044   14894 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:28.972803   14894 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:28.973286   14894 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:28.974839   14894 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:16:28.978439   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:16:28.978450   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:16:29.041343   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:16:29.041363   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:16:31.570595   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:16:31.583500   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:16:31.583573   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:16:31.611783   54807 cri.go:89] found id: ""
	I1202 19:16:31.611796   54807 logs.go:282] 0 containers: []
	W1202 19:16:31.611805   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:16:31.611811   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:16:31.611868   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:16:31.639061   54807 cri.go:89] found id: ""
	I1202 19:16:31.639074   54807 logs.go:282] 0 containers: []
	W1202 19:16:31.639081   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:16:31.639086   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:16:31.639152   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:16:31.664706   54807 cri.go:89] found id: ""
	I1202 19:16:31.664719   54807 logs.go:282] 0 containers: []
	W1202 19:16:31.664726   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:16:31.664732   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:16:31.664789   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:16:31.688725   54807 cri.go:89] found id: ""
	I1202 19:16:31.688739   54807 logs.go:282] 0 containers: []
	W1202 19:16:31.688746   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:16:31.688751   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:16:31.688807   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:16:31.713308   54807 cri.go:89] found id: ""
	I1202 19:16:31.713321   54807 logs.go:282] 0 containers: []
	W1202 19:16:31.713328   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:16:31.713333   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:16:31.713391   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:16:31.737960   54807 cri.go:89] found id: ""
	I1202 19:16:31.737973   54807 logs.go:282] 0 containers: []
	W1202 19:16:31.737980   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:16:31.737985   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:16:31.738041   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:16:31.766035   54807 cri.go:89] found id: ""
	I1202 19:16:31.766048   54807 logs.go:282] 0 containers: []
	W1202 19:16:31.766055   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:16:31.766063   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:16:31.766078   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:16:31.821307   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:16:31.821327   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:16:31.832103   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:16:31.832118   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:16:31.894804   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:16:31.886409   14997 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:31.887232   14997 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:31.888852   14997 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:31.889449   14997 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:31.891143   14997 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:16:31.886409   14997 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:31.887232   14997 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:31.888852   14997 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:31.889449   14997 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:31.891143   14997 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:16:31.894814   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:16:31.894824   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:16:31.958623   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:16:31.958641   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:16:34.494532   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:16:34.504804   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:16:34.504861   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:16:34.534339   54807 cri.go:89] found id: ""
	I1202 19:16:34.534359   54807 logs.go:282] 0 containers: []
	W1202 19:16:34.534366   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:16:34.534372   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:16:34.534430   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:16:34.559181   54807 cri.go:89] found id: ""
	I1202 19:16:34.559194   54807 logs.go:282] 0 containers: []
	W1202 19:16:34.559203   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:16:34.559208   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:16:34.559266   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:16:34.583120   54807 cri.go:89] found id: ""
	I1202 19:16:34.583133   54807 logs.go:282] 0 containers: []
	W1202 19:16:34.583139   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:16:34.583145   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:16:34.583245   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:16:34.608256   54807 cri.go:89] found id: ""
	I1202 19:16:34.608269   54807 logs.go:282] 0 containers: []
	W1202 19:16:34.608276   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:16:34.608282   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:16:34.608365   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:16:34.632733   54807 cri.go:89] found id: ""
	I1202 19:16:34.632747   54807 logs.go:282] 0 containers: []
	W1202 19:16:34.632754   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:16:34.632759   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:16:34.632821   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:16:34.663293   54807 cri.go:89] found id: ""
	I1202 19:16:34.663307   54807 logs.go:282] 0 containers: []
	W1202 19:16:34.663314   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:16:34.663320   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:16:34.663376   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:16:34.686842   54807 cri.go:89] found id: ""
	I1202 19:16:34.686856   54807 logs.go:282] 0 containers: []
	W1202 19:16:34.686863   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:16:34.686871   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:16:34.686881   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:16:34.697549   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:16:34.697564   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:16:34.764406   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:16:34.756417   15099 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:34.757141   15099 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:34.758783   15099 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:34.759285   15099 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:34.760837   15099 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:16:34.756417   15099 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:34.757141   15099 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:34.758783   15099 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:34.759285   15099 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:34.760837   15099 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:16:34.764416   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:16:34.764427   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:16:34.827201   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:16:34.827223   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:16:34.854552   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:16:34.854570   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:16:37.413003   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:16:37.423382   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:16:37.423441   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:16:37.459973   54807 cri.go:89] found id: ""
	I1202 19:16:37.459987   54807 logs.go:282] 0 containers: []
	W1202 19:16:37.459994   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:16:37.460000   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:16:37.460062   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:16:37.494488   54807 cri.go:89] found id: ""
	I1202 19:16:37.494503   54807 logs.go:282] 0 containers: []
	W1202 19:16:37.494510   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:16:37.494515   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:16:37.494584   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:16:37.519270   54807 cri.go:89] found id: ""
	I1202 19:16:37.519283   54807 logs.go:282] 0 containers: []
	W1202 19:16:37.519290   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:16:37.519295   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:16:37.519351   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:16:37.545987   54807 cri.go:89] found id: ""
	I1202 19:16:37.546001   54807 logs.go:282] 0 containers: []
	W1202 19:16:37.546008   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:16:37.546013   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:16:37.546069   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:16:37.574348   54807 cri.go:89] found id: ""
	I1202 19:16:37.574362   54807 logs.go:282] 0 containers: []
	W1202 19:16:37.574369   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:16:37.574375   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:16:37.574437   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:16:37.600075   54807 cri.go:89] found id: ""
	I1202 19:16:37.600089   54807 logs.go:282] 0 containers: []
	W1202 19:16:37.600096   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:16:37.600102   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:16:37.600167   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:16:37.625421   54807 cri.go:89] found id: ""
	I1202 19:16:37.625434   54807 logs.go:282] 0 containers: []
	W1202 19:16:37.625443   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:16:37.625450   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:16:37.625460   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:16:37.688980   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:16:37.689000   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:16:37.719329   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:16:37.719344   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:16:37.778206   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:16:37.778225   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:16:37.789133   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:16:37.789148   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:16:37.856498   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:16:37.848835   15218 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:37.849672   15218 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:37.851302   15218 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:37.851842   15218 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:37.853113   15218 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:16:37.848835   15218 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:37.849672   15218 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:37.851302   15218 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:37.851842   15218 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:37.853113   15218 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:16:40.358183   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:16:40.368449   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:16:40.368509   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:16:40.392705   54807 cri.go:89] found id: ""
	I1202 19:16:40.392721   54807 logs.go:282] 0 containers: []
	W1202 19:16:40.392728   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:16:40.392734   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:16:40.392796   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:16:40.417408   54807 cri.go:89] found id: ""
	I1202 19:16:40.417422   54807 logs.go:282] 0 containers: []
	W1202 19:16:40.417429   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:16:40.417435   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:16:40.417493   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:16:40.458012   54807 cri.go:89] found id: ""
	I1202 19:16:40.458026   54807 logs.go:282] 0 containers: []
	W1202 19:16:40.458033   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:16:40.458039   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:16:40.458094   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:16:40.498315   54807 cri.go:89] found id: ""
	I1202 19:16:40.498328   54807 logs.go:282] 0 containers: []
	W1202 19:16:40.498335   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:16:40.498341   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:16:40.498402   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:16:40.523770   54807 cri.go:89] found id: ""
	I1202 19:16:40.523784   54807 logs.go:282] 0 containers: []
	W1202 19:16:40.523792   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:16:40.523797   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:16:40.523865   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:16:40.549124   54807 cri.go:89] found id: ""
	I1202 19:16:40.549137   54807 logs.go:282] 0 containers: []
	W1202 19:16:40.549144   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:16:40.549149   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:16:40.549207   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:16:40.573667   54807 cri.go:89] found id: ""
	I1202 19:16:40.573680   54807 logs.go:282] 0 containers: []
	W1202 19:16:40.573688   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:16:40.573696   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:16:40.573708   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:16:40.629671   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:16:40.629688   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:16:40.640745   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:16:40.640760   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:16:40.706165   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:16:40.697573   15312 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:40.698405   15312 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:40.700020   15312 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:40.700368   15312 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:40.702012   15312 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:16:40.697573   15312 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:40.698405   15312 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:40.700020   15312 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:40.700368   15312 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:40.702012   15312 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:16:40.706175   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:16:40.706186   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:16:40.775737   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:16:40.775755   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:16:43.307135   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:16:43.317487   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:16:43.317553   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:16:43.342709   54807 cri.go:89] found id: ""
	I1202 19:16:43.342722   54807 logs.go:282] 0 containers: []
	W1202 19:16:43.342730   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:16:43.342735   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:16:43.342793   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:16:43.367380   54807 cri.go:89] found id: ""
	I1202 19:16:43.367393   54807 logs.go:282] 0 containers: []
	W1202 19:16:43.367400   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:16:43.367406   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:16:43.367462   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:16:43.394678   54807 cri.go:89] found id: ""
	I1202 19:16:43.394691   54807 logs.go:282] 0 containers: []
	W1202 19:16:43.394699   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:16:43.394704   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:16:43.394761   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:16:43.421130   54807 cri.go:89] found id: ""
	I1202 19:16:43.421144   54807 logs.go:282] 0 containers: []
	W1202 19:16:43.421151   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:16:43.421156   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:16:43.421212   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:16:43.454728   54807 cri.go:89] found id: ""
	I1202 19:16:43.454741   54807 logs.go:282] 0 containers: []
	W1202 19:16:43.454749   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:16:43.454754   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:16:43.454810   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:16:43.491457   54807 cri.go:89] found id: ""
	I1202 19:16:43.491470   54807 logs.go:282] 0 containers: []
	W1202 19:16:43.491477   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:16:43.491482   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:16:43.491537   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:16:43.515943   54807 cri.go:89] found id: ""
	I1202 19:16:43.515957   54807 logs.go:282] 0 containers: []
	W1202 19:16:43.515964   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:16:43.515972   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:16:43.515982   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:16:43.579953   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:16:43.579972   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:16:43.608617   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:16:43.608632   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:16:43.666586   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:16:43.666604   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:16:43.677358   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:16:43.677374   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:16:43.741646   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:16:43.733470   15433 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:43.734235   15433 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:43.735923   15433 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:43.736656   15433 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:43.738348   15433 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:16:43.733470   15433 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:43.734235   15433 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:43.735923   15433 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:43.736656   15433 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:43.738348   15433 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:16:46.243365   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:16:46.255599   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:16:46.255658   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:16:46.280357   54807 cri.go:89] found id: ""
	I1202 19:16:46.280369   54807 logs.go:282] 0 containers: []
	W1202 19:16:46.280376   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:16:46.280382   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:16:46.280444   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:16:46.304610   54807 cri.go:89] found id: ""
	I1202 19:16:46.304623   54807 logs.go:282] 0 containers: []
	W1202 19:16:46.304630   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:16:46.304635   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:16:46.304692   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:16:46.328944   54807 cri.go:89] found id: ""
	I1202 19:16:46.328957   54807 logs.go:282] 0 containers: []
	W1202 19:16:46.328963   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:16:46.328968   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:16:46.329027   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:16:46.357896   54807 cri.go:89] found id: ""
	I1202 19:16:46.357909   54807 logs.go:282] 0 containers: []
	W1202 19:16:46.357916   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:16:46.357923   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:16:46.357981   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:16:46.381601   54807 cri.go:89] found id: ""
	I1202 19:16:46.381613   54807 logs.go:282] 0 containers: []
	W1202 19:16:46.381620   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:16:46.381626   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:16:46.381687   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:16:46.406928   54807 cri.go:89] found id: ""
	I1202 19:16:46.406942   54807 logs.go:282] 0 containers: []
	W1202 19:16:46.406949   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:16:46.406954   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:16:46.407009   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:16:46.449373   54807 cri.go:89] found id: ""
	I1202 19:16:46.449386   54807 logs.go:282] 0 containers: []
	W1202 19:16:46.449393   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:16:46.449401   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:16:46.449411   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:16:46.516162   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:16:46.516180   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:16:46.527166   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:16:46.527183   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:16:46.590201   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:16:46.582136   15525 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:46.583309   15525 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:46.584090   15525 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:46.585115   15525 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:46.585711   15525 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:16:46.582136   15525 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:46.583309   15525 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:46.584090   15525 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:46.585115   15525 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:46.585711   15525 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:16:46.590211   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:16:46.590221   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:16:46.652574   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:16:46.652593   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:16:49.180131   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:16:49.190665   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:16:49.190729   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:16:49.215295   54807 cri.go:89] found id: ""
	I1202 19:16:49.215308   54807 logs.go:282] 0 containers: []
	W1202 19:16:49.215315   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:16:49.215321   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:16:49.215382   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:16:49.241898   54807 cri.go:89] found id: ""
	I1202 19:16:49.241912   54807 logs.go:282] 0 containers: []
	W1202 19:16:49.241919   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:16:49.241925   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:16:49.241986   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:16:49.266638   54807 cri.go:89] found id: ""
	I1202 19:16:49.266651   54807 logs.go:282] 0 containers: []
	W1202 19:16:49.266658   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:16:49.266664   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:16:49.266719   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:16:49.292478   54807 cri.go:89] found id: ""
	I1202 19:16:49.292496   54807 logs.go:282] 0 containers: []
	W1202 19:16:49.292506   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:16:49.292512   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:16:49.292589   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:16:49.318280   54807 cri.go:89] found id: ""
	I1202 19:16:49.318293   54807 logs.go:282] 0 containers: []
	W1202 19:16:49.318300   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:16:49.318306   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:16:49.318373   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:16:49.351760   54807 cri.go:89] found id: ""
	I1202 19:16:49.351774   54807 logs.go:282] 0 containers: []
	W1202 19:16:49.351787   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:16:49.351793   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:16:49.351854   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:16:49.376513   54807 cri.go:89] found id: ""
	I1202 19:16:49.376536   54807 logs.go:282] 0 containers: []
	W1202 19:16:49.376543   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:16:49.376551   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:16:49.376563   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:16:49.448960   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:16:49.448987   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:16:49.482655   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:16:49.482673   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:16:49.541305   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:16:49.541322   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:16:49.552971   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:16:49.552988   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:16:49.618105   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:16:49.609636   15641 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:49.610374   15641 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:49.612148   15641 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:49.612903   15641 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:49.614517   15641 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:16:49.609636   15641 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:49.610374   15641 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:49.612148   15641 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:49.612903   15641 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:49.614517   15641 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:16:52.119791   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:16:52.130607   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:16:52.130669   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:16:52.155643   54807 cri.go:89] found id: ""
	I1202 19:16:52.155656   54807 logs.go:282] 0 containers: []
	W1202 19:16:52.155663   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:16:52.155669   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:16:52.155729   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:16:52.179230   54807 cri.go:89] found id: ""
	I1202 19:16:52.179244   54807 logs.go:282] 0 containers: []
	W1202 19:16:52.179253   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:16:52.179259   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:16:52.179316   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:16:52.203772   54807 cri.go:89] found id: ""
	I1202 19:16:52.203785   54807 logs.go:282] 0 containers: []
	W1202 19:16:52.203792   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:16:52.203798   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:16:52.203852   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:16:52.236168   54807 cri.go:89] found id: ""
	I1202 19:16:52.236183   54807 logs.go:282] 0 containers: []
	W1202 19:16:52.236190   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:16:52.236196   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:16:52.236257   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:16:52.260979   54807 cri.go:89] found id: ""
	I1202 19:16:52.260995   54807 logs.go:282] 0 containers: []
	W1202 19:16:52.261003   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:16:52.261008   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:16:52.261063   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:16:52.284287   54807 cri.go:89] found id: ""
	I1202 19:16:52.284299   54807 logs.go:282] 0 containers: []
	W1202 19:16:52.284306   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:16:52.284312   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:16:52.284385   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:16:52.310376   54807 cri.go:89] found id: ""
	I1202 19:16:52.310390   54807 logs.go:282] 0 containers: []
	W1202 19:16:52.310397   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:16:52.310405   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:16:52.310415   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:16:52.366619   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:16:52.366636   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:16:52.377556   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:16:52.377572   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:16:52.453208   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:16:52.444029   15727 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:52.444937   15727 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:52.446866   15727 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:52.447608   15727 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:52.449367   15727 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:16:52.444029   15727 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:52.444937   15727 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:52.446866   15727 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:52.447608   15727 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:52.449367   15727 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:16:52.453218   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:16:52.453229   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:16:52.524196   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:16:52.524214   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:16:55.052717   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:16:55.063878   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:16:55.063943   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:16:55.089568   54807 cri.go:89] found id: ""
	I1202 19:16:55.089582   54807 logs.go:282] 0 containers: []
	W1202 19:16:55.089588   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:16:55.089594   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:16:55.089658   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:16:55.116741   54807 cri.go:89] found id: ""
	I1202 19:16:55.116755   54807 logs.go:282] 0 containers: []
	W1202 19:16:55.116762   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:16:55.116768   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:16:55.116825   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:16:55.142748   54807 cri.go:89] found id: ""
	I1202 19:16:55.142761   54807 logs.go:282] 0 containers: []
	W1202 19:16:55.142768   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:16:55.142774   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:16:55.142836   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:16:55.167341   54807 cri.go:89] found id: ""
	I1202 19:16:55.167354   54807 logs.go:282] 0 containers: []
	W1202 19:16:55.167361   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:16:55.167367   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:16:55.167424   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:16:55.194118   54807 cri.go:89] found id: ""
	I1202 19:16:55.194132   54807 logs.go:282] 0 containers: []
	W1202 19:16:55.194139   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:16:55.194144   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:16:55.194201   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:16:55.218379   54807 cri.go:89] found id: ""
	I1202 19:16:55.218393   54807 logs.go:282] 0 containers: []
	W1202 19:16:55.218400   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:16:55.218406   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:16:55.218465   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:16:55.243035   54807 cri.go:89] found id: ""
	I1202 19:16:55.243048   54807 logs.go:282] 0 containers: []
	W1202 19:16:55.243055   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:16:55.243063   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:16:55.243073   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:16:55.310493   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:16:55.302627   15827 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:55.303246   15827 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:55.304790   15827 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:55.305303   15827 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:55.306777   15827 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:16:55.302627   15827 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:55.303246   15827 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:55.304790   15827 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:55.305303   15827 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:55.306777   15827 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:16:55.310504   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:16:55.310517   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:16:55.373914   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:16:55.373933   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:16:55.405157   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:16:55.405172   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:16:55.473565   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:16:55.473583   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:16:57.986363   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:16:57.996902   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:16:57.996969   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:16:58.023028   54807 cri.go:89] found id: ""
	I1202 19:16:58.023042   54807 logs.go:282] 0 containers: []
	W1202 19:16:58.023049   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:16:58.023055   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:16:58.023113   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:16:58.049927   54807 cri.go:89] found id: ""
	I1202 19:16:58.049941   54807 logs.go:282] 0 containers: []
	W1202 19:16:58.049947   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:16:58.049953   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:16:58.050013   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:16:58.078428   54807 cri.go:89] found id: ""
	I1202 19:16:58.078448   54807 logs.go:282] 0 containers: []
	W1202 19:16:58.078456   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:16:58.078461   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:16:58.078516   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:16:58.105365   54807 cri.go:89] found id: ""
	I1202 19:16:58.105377   54807 logs.go:282] 0 containers: []
	W1202 19:16:58.105385   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:16:58.105390   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:16:58.105448   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:16:58.129444   54807 cri.go:89] found id: ""
	I1202 19:16:58.129458   54807 logs.go:282] 0 containers: []
	W1202 19:16:58.129465   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:16:58.129470   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:16:58.129531   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:16:58.157574   54807 cri.go:89] found id: ""
	I1202 19:16:58.157588   54807 logs.go:282] 0 containers: []
	W1202 19:16:58.157594   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:16:58.157607   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:16:58.157670   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:16:58.182028   54807 cri.go:89] found id: ""
	I1202 19:16:58.182042   54807 logs.go:282] 0 containers: []
	W1202 19:16:58.182049   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:16:58.182057   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:16:58.182067   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:16:58.241166   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:16:58.241184   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:16:58.252367   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:16:58.252383   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:16:58.319914   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:16:58.311929   15940 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:58.312824   15940 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:58.314498   15940 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:58.315073   15940 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:58.316444   15940 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:16:58.311929   15940 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:58.312824   15940 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:58.314498   15940 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:58.315073   15940 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:58.316444   15940 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:16:58.319925   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:16:58.319937   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:16:58.381228   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:16:58.381246   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:17:00.909644   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:17:00.920924   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:17:00.921037   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:17:00.947793   54807 cri.go:89] found id: ""
	I1202 19:17:00.947812   54807 logs.go:282] 0 containers: []
	W1202 19:17:00.947820   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:17:00.947828   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:17:00.947900   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:17:00.975539   54807 cri.go:89] found id: ""
	I1202 19:17:00.975553   54807 logs.go:282] 0 containers: []
	W1202 19:17:00.975561   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:17:00.975566   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:17:00.975629   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:17:01.002532   54807 cri.go:89] found id: ""
	I1202 19:17:01.002549   54807 logs.go:282] 0 containers: []
	W1202 19:17:01.002560   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:17:01.002566   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:17:01.002636   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:17:01.032211   54807 cri.go:89] found id: ""
	I1202 19:17:01.032226   54807 logs.go:282] 0 containers: []
	W1202 19:17:01.032233   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:17:01.032239   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:17:01.032302   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:17:01.059398   54807 cri.go:89] found id: ""
	I1202 19:17:01.059413   54807 logs.go:282] 0 containers: []
	W1202 19:17:01.059420   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:17:01.059426   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:17:01.059486   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:17:01.091722   54807 cri.go:89] found id: ""
	I1202 19:17:01.091740   54807 logs.go:282] 0 containers: []
	W1202 19:17:01.091746   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:17:01.091752   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:17:01.091816   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:17:01.117849   54807 cri.go:89] found id: ""
	I1202 19:17:01.117864   54807 logs.go:282] 0 containers: []
	W1202 19:17:01.117871   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:17:01.117879   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:17:01.117893   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:17:01.191972   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:17:01.182202   16040 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:01.183119   16040 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:01.185191   16040 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:01.186030   16040 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:01.187874   16040 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:17:01.182202   16040 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:01.183119   16040 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:01.185191   16040 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:01.186030   16040 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:01.187874   16040 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:17:01.191984   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:17:01.191997   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:17:01.260783   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:17:01.260806   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:17:01.290665   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:17:01.290683   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:17:01.348633   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:17:01.348653   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:17:03.860845   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:17:03.871899   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:17:03.871966   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:17:03.899158   54807 cri.go:89] found id: ""
	I1202 19:17:03.899172   54807 logs.go:282] 0 containers: []
	W1202 19:17:03.899179   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:17:03.899185   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:17:03.899244   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:17:03.925147   54807 cri.go:89] found id: ""
	I1202 19:17:03.925161   54807 logs.go:282] 0 containers: []
	W1202 19:17:03.925168   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:17:03.925174   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:17:03.925235   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:17:03.955130   54807 cri.go:89] found id: ""
	I1202 19:17:03.955143   54807 logs.go:282] 0 containers: []
	W1202 19:17:03.955150   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:17:03.955156   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:17:03.955215   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:17:03.983272   54807 cri.go:89] found id: ""
	I1202 19:17:03.983286   54807 logs.go:282] 0 containers: []
	W1202 19:17:03.983294   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:17:03.983300   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:17:03.983371   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:17:04.009435   54807 cri.go:89] found id: ""
	I1202 19:17:04.009449   54807 logs.go:282] 0 containers: []
	W1202 19:17:04.009456   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:17:04.009463   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:17:04.009523   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:17:04.037346   54807 cri.go:89] found id: ""
	I1202 19:17:04.037360   54807 logs.go:282] 0 containers: []
	W1202 19:17:04.037368   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:17:04.037374   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:17:04.037433   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:17:04.066662   54807 cri.go:89] found id: ""
	I1202 19:17:04.066675   54807 logs.go:282] 0 containers: []
	W1202 19:17:04.066682   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:17:04.066690   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:17:04.066701   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:17:04.125350   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:17:04.125369   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:17:04.136698   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:17:04.136716   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:17:04.206327   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:17:04.198119   16153 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:04.198761   16153 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:04.200411   16153 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:04.201024   16153 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:04.202642   16153 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:17:04.198119   16153 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:04.198761   16153 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:04.200411   16153 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:04.201024   16153 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:04.202642   16153 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:17:04.206338   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:17:04.206353   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:17:04.274588   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:17:04.274608   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:17:06.806010   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:17:06.817189   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:17:06.817256   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:17:06.843114   54807 cri.go:89] found id: ""
	I1202 19:17:06.843129   54807 logs.go:282] 0 containers: []
	W1202 19:17:06.843136   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:17:06.843142   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:17:06.843218   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:17:06.873921   54807 cri.go:89] found id: ""
	I1202 19:17:06.873947   54807 logs.go:282] 0 containers: []
	W1202 19:17:06.873955   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:17:06.873961   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:17:06.874045   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:17:06.900636   54807 cri.go:89] found id: ""
	I1202 19:17:06.900651   54807 logs.go:282] 0 containers: []
	W1202 19:17:06.900658   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:17:06.900664   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:17:06.900724   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:17:06.928484   54807 cri.go:89] found id: ""
	I1202 19:17:06.928504   54807 logs.go:282] 0 containers: []
	W1202 19:17:06.928512   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:17:06.928518   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:17:06.928583   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:17:06.956137   54807 cri.go:89] found id: ""
	I1202 19:17:06.956170   54807 logs.go:282] 0 containers: []
	W1202 19:17:06.956179   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:17:06.956184   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:17:06.956258   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:17:06.987383   54807 cri.go:89] found id: ""
	I1202 19:17:06.987408   54807 logs.go:282] 0 containers: []
	W1202 19:17:06.987416   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:17:06.987422   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:17:06.987495   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:17:07.013712   54807 cri.go:89] found id: ""
	I1202 19:17:07.013726   54807 logs.go:282] 0 containers: []
	W1202 19:17:07.013733   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:17:07.013741   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:17:07.013756   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:17:07.076937   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:17:07.076955   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:17:07.106847   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:17:07.106863   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:17:07.164565   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:17:07.164584   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:17:07.177132   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:17:07.177154   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:17:07.245572   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:17:07.237375   16272 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:07.238081   16272 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:07.239663   16272 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:07.240205   16272 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:07.241966   16272 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:17:07.237375   16272 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:07.238081   16272 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:07.239663   16272 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:07.240205   16272 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:07.241966   16272 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:17:09.745822   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:17:09.756122   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:17:09.756180   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:17:09.784649   54807 cri.go:89] found id: ""
	I1202 19:17:09.784663   54807 logs.go:282] 0 containers: []
	W1202 19:17:09.784670   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:17:09.784675   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:17:09.784732   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:17:09.809632   54807 cri.go:89] found id: ""
	I1202 19:17:09.809655   54807 logs.go:282] 0 containers: []
	W1202 19:17:09.809662   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:17:09.809668   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:17:09.809733   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:17:09.839403   54807 cri.go:89] found id: ""
	I1202 19:17:09.839425   54807 logs.go:282] 0 containers: []
	W1202 19:17:09.839433   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:17:09.839439   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:17:09.839504   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:17:09.868977   54807 cri.go:89] found id: ""
	I1202 19:17:09.868991   54807 logs.go:282] 0 containers: []
	W1202 19:17:09.868999   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:17:09.869004   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:17:09.869064   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:17:09.894156   54807 cri.go:89] found id: ""
	I1202 19:17:09.894170   54807 logs.go:282] 0 containers: []
	W1202 19:17:09.894176   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:17:09.894182   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:17:09.894237   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:17:09.919174   54807 cri.go:89] found id: ""
	I1202 19:17:09.919188   54807 logs.go:282] 0 containers: []
	W1202 19:17:09.919195   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:17:09.919200   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:17:09.919261   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:17:09.944620   54807 cri.go:89] found id: ""
	I1202 19:17:09.944632   54807 logs.go:282] 0 containers: []
	W1202 19:17:09.944639   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:17:09.944647   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:17:09.944657   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:17:10.004028   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:17:10.004049   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:17:10.015962   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:17:10.015979   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:17:10.086133   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:17:10.078544   16365 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:10.079196   16365 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:10.080924   16365 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:10.081452   16365 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:10.082613   16365 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:17:10.078544   16365 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:10.079196   16365 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:10.080924   16365 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:10.081452   16365 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:10.082613   16365 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:17:10.086143   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:17:10.086153   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:17:10.148419   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:17:10.148437   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:17:12.676458   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:17:12.687083   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:17:12.687155   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:17:12.712590   54807 cri.go:89] found id: ""
	I1202 19:17:12.712604   54807 logs.go:282] 0 containers: []
	W1202 19:17:12.712611   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:17:12.712616   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:17:12.712674   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:17:12.737565   54807 cri.go:89] found id: ""
	I1202 19:17:12.737578   54807 logs.go:282] 0 containers: []
	W1202 19:17:12.737585   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:17:12.737591   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:17:12.737648   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:17:12.762201   54807 cri.go:89] found id: ""
	I1202 19:17:12.762216   54807 logs.go:282] 0 containers: []
	W1202 19:17:12.762223   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:17:12.762228   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:17:12.762288   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:17:12.786736   54807 cri.go:89] found id: ""
	I1202 19:17:12.786750   54807 logs.go:282] 0 containers: []
	W1202 19:17:12.786758   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:17:12.786763   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:17:12.786825   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:17:12.811994   54807 cri.go:89] found id: ""
	I1202 19:17:12.812008   54807 logs.go:282] 0 containers: []
	W1202 19:17:12.812015   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:17:12.812020   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:17:12.812078   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:17:12.838580   54807 cri.go:89] found id: ""
	I1202 19:17:12.838593   54807 logs.go:282] 0 containers: []
	W1202 19:17:12.838600   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:17:12.838605   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:17:12.838659   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:17:12.863652   54807 cri.go:89] found id: ""
	I1202 19:17:12.863665   54807 logs.go:282] 0 containers: []
	W1202 19:17:12.863672   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:17:12.863679   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:17:12.863689   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:17:12.918766   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:17:12.918784   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:17:12.930406   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:17:12.930428   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:17:13.000633   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:17:12.992135   16467 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:12.992970   16467 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:12.994592   16467 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:12.994977   16467 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:12.996559   16467 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:17:12.992135   16467 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:12.992970   16467 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:12.994592   16467 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:12.994977   16467 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:12.996559   16467 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:17:13.000643   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:17:13.000655   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:17:13.065384   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:17:13.065403   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:17:15.594382   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:17:15.604731   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:17:15.604795   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:17:15.634332   54807 cri.go:89] found id: ""
	I1202 19:17:15.634345   54807 logs.go:282] 0 containers: []
	W1202 19:17:15.634353   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:17:15.634358   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:17:15.634434   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:17:15.663126   54807 cri.go:89] found id: ""
	I1202 19:17:15.663141   54807 logs.go:282] 0 containers: []
	W1202 19:17:15.663148   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:17:15.663153   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:17:15.663217   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:17:15.699033   54807 cri.go:89] found id: ""
	I1202 19:17:15.699051   54807 logs.go:282] 0 containers: []
	W1202 19:17:15.699059   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:17:15.699065   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:17:15.699121   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:17:15.727044   54807 cri.go:89] found id: ""
	I1202 19:17:15.727057   54807 logs.go:282] 0 containers: []
	W1202 19:17:15.727065   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:17:15.727071   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:17:15.727129   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:17:15.754131   54807 cri.go:89] found id: ""
	I1202 19:17:15.754152   54807 logs.go:282] 0 containers: []
	W1202 19:17:15.754159   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:17:15.754165   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:17:15.754224   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:17:15.778325   54807 cri.go:89] found id: ""
	I1202 19:17:15.778338   54807 logs.go:282] 0 containers: []
	W1202 19:17:15.778345   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:17:15.778350   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:17:15.778407   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:17:15.803363   54807 cri.go:89] found id: ""
	I1202 19:17:15.803376   54807 logs.go:282] 0 containers: []
	W1202 19:17:15.803383   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:17:15.803391   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:17:15.803403   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:17:15.814039   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:17:15.814055   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:17:15.885494   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:17:15.877151   16570 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:15.877774   16570 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:15.878766   16570 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:15.880347   16570 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:15.880955   16570 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
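Every `kubectl` call above dies with `connect: connection refused` because nothing is listening on the apiserver port 8441 yet. A minimal probe for that condition (a hypothetical helper, not part of minikube, using bash's `/dev/tcp` redirection) looks like:

```shell
#!/usr/bin/env bash
# Probe whether anything accepts TCP connections on the apiserver port.
# This only checks reachability, not apiserver health.
port_open() {                       # usage: port_open HOST PORT
  timeout 1 bash -c "exec 3<>/dev/tcp/$1/$2" 2>/dev/null
}

if port_open localhost 8441; then
  echo "apiserver port 8441 is accepting connections"
else
  echo "apiserver port 8441 refused/unreachable (matches the errors above)"
fi
```

A closed port makes the inner `bash -c` fail immediately with "connection refused", so `port_open` returns nonzero, which is exactly what the repeated `dial tcp [::1]:8441` errors in this log correspond to.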
	I1202 19:17:15.885505   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:17:15.885516   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:17:15.947276   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:17:15.947295   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:17:15.979963   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:17:15.979981   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:17:18.538313   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:17:18.548423   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:17:18.548490   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:17:18.571700   54807 cri.go:89] found id: ""
	I1202 19:17:18.571714   54807 logs.go:282] 0 containers: []
	W1202 19:17:18.571721   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:17:18.571726   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:17:18.571784   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:17:18.600197   54807 cri.go:89] found id: ""
	I1202 19:17:18.600211   54807 logs.go:282] 0 containers: []
	W1202 19:17:18.600219   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:17:18.600224   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:17:18.600279   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:17:18.628309   54807 cri.go:89] found id: ""
	I1202 19:17:18.628341   54807 logs.go:282] 0 containers: []
	W1202 19:17:18.628348   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:17:18.628353   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:17:18.628440   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:17:18.654241   54807 cri.go:89] found id: ""
	I1202 19:17:18.654255   54807 logs.go:282] 0 containers: []
	W1202 19:17:18.654263   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:17:18.654268   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:17:18.654325   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:17:18.690109   54807 cri.go:89] found id: ""
	I1202 19:17:18.690123   54807 logs.go:282] 0 containers: []
	W1202 19:17:18.690130   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:17:18.690135   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:17:18.690194   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:17:18.719625   54807 cri.go:89] found id: ""
	I1202 19:17:18.719638   54807 logs.go:282] 0 containers: []
	W1202 19:17:18.719646   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:17:18.719651   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:17:18.719713   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:17:18.753094   54807 cri.go:89] found id: ""
	I1202 19:17:18.753108   54807 logs.go:282] 0 containers: []
	W1202 19:17:18.753116   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:17:18.753124   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:17:18.753135   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:17:18.782592   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:17:18.782608   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:17:18.837738   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:17:18.837757   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:17:18.848921   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:17:18.848937   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:17:18.918012   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:17:18.908756   16691 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:18.909735   16691 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:18.911590   16691 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:18.912295   16691 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:18.913925   16691 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1202 19:17:18.918023   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:17:18.918034   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:17:21.481252   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:17:21.491493   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:17:21.491550   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:17:21.515967   54807 cri.go:89] found id: ""
	I1202 19:17:21.515980   54807 logs.go:282] 0 containers: []
	W1202 19:17:21.515987   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:17:21.515993   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:17:21.516049   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:17:21.545239   54807 cri.go:89] found id: ""
	I1202 19:17:21.545256   54807 logs.go:282] 0 containers: []
	W1202 19:17:21.545263   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:17:21.545268   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:17:21.545349   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:17:21.574561   54807 cri.go:89] found id: ""
	I1202 19:17:21.574575   54807 logs.go:282] 0 containers: []
	W1202 19:17:21.574582   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:17:21.574588   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:17:21.574643   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:17:21.600546   54807 cri.go:89] found id: ""
	I1202 19:17:21.600567   54807 logs.go:282] 0 containers: []
	W1202 19:17:21.600575   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:17:21.600581   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:17:21.600647   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:17:21.625602   54807 cri.go:89] found id: ""
	I1202 19:17:21.625616   54807 logs.go:282] 0 containers: []
	W1202 19:17:21.625623   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:17:21.625629   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:17:21.625691   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:17:21.650573   54807 cri.go:89] found id: ""
	I1202 19:17:21.650586   54807 logs.go:282] 0 containers: []
	W1202 19:17:21.650593   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:17:21.650599   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:17:21.650655   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:17:21.680099   54807 cri.go:89] found id: ""
	I1202 19:17:21.680113   54807 logs.go:282] 0 containers: []
	W1202 19:17:21.680120   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:17:21.680128   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:17:21.680155   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:17:21.750582   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:17:21.750601   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:17:21.762564   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:17:21.762580   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:17:21.827497   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:17:21.819360   16787 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:21.820080   16787 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:21.821817   16787 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:21.822472   16787 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:21.824049   16787 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1202 19:17:21.827507   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:17:21.827518   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:17:21.889794   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:17:21.889812   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:17:24.421754   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:17:24.432162   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:17:24.432233   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:17:24.456800   54807 cri.go:89] found id: ""
	I1202 19:17:24.456814   54807 logs.go:282] 0 containers: []
	W1202 19:17:24.456821   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:17:24.456826   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:17:24.456901   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:17:24.481502   54807 cri.go:89] found id: ""
	I1202 19:17:24.481516   54807 logs.go:282] 0 containers: []
	W1202 19:17:24.481523   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:17:24.481529   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:17:24.481587   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:17:24.505876   54807 cri.go:89] found id: ""
	I1202 19:17:24.505918   54807 logs.go:282] 0 containers: []
	W1202 19:17:24.505925   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:17:24.505931   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:17:24.505990   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:17:24.530651   54807 cri.go:89] found id: ""
	I1202 19:17:24.530665   54807 logs.go:282] 0 containers: []
	W1202 19:17:24.530673   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:17:24.530689   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:17:24.530749   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:17:24.556247   54807 cri.go:89] found id: ""
	I1202 19:17:24.556260   54807 logs.go:282] 0 containers: []
	W1202 19:17:24.556277   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:17:24.556283   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:17:24.556391   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:17:24.585748   54807 cri.go:89] found id: ""
	I1202 19:17:24.585761   54807 logs.go:282] 0 containers: []
	W1202 19:17:24.585769   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:17:24.585774   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:17:24.585833   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:17:24.610350   54807 cri.go:89] found id: ""
	I1202 19:17:24.610363   54807 logs.go:282] 0 containers: []
	W1202 19:17:24.610370   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:17:24.610377   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:17:24.610388   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:17:24.680866   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:17:24.664630   16879 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:24.665302   16879 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:24.667106   16879 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:24.667645   16879 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:24.669330   16879 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1202 19:17:24.680876   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:17:24.680887   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:17:24.756955   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:17:24.756975   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:17:24.784854   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:17:24.784869   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:17:24.849848   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:17:24.849872   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
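The recurring "container status" command embeds a runtime-fallback idiom worth noting: `sudo \`which crictl || echo crictl\` ps -a || sudo docker ps -a`. A small sketch of that idiom (echoing the command instead of running it under sudo, so it works without a cluster):

```shell
#!/usr/bin/env bash
# Fallback idiom from the log's "container status" step:
#   sudo `which crictl || echo crictl` ps -a || sudo docker ps -a
# If crictl is on PATH use its resolved path, otherwise keep the bare
# name; if that whole invocation fails, the log's command falls back
# to `docker ps -a`.
runtime_cli=$(which crictl 2>/dev/null || echo crictl)
echo "would run: sudo ${runtime_cli} ps -a || sudo docker ps -a"
```

Either way `runtime_cli` ends in `crictl`, so the composed command is always well-formed even on hosts where crictl is absent.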
	I1202 19:17:27.361613   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:17:27.375047   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:17:27.375145   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:17:27.399753   54807 cri.go:89] found id: ""
	I1202 19:17:27.399767   54807 logs.go:282] 0 containers: []
	W1202 19:17:27.399774   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:17:27.399780   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:17:27.399838   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:17:27.430016   54807 cri.go:89] found id: ""
	I1202 19:17:27.430030   54807 logs.go:282] 0 containers: []
	W1202 19:17:27.430037   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:17:27.430043   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:17:27.430102   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:17:27.455165   54807 cri.go:89] found id: ""
	I1202 19:17:27.455178   54807 logs.go:282] 0 containers: []
	W1202 19:17:27.455186   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:17:27.455191   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:17:27.455251   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:17:27.481353   54807 cri.go:89] found id: ""
	I1202 19:17:27.481367   54807 logs.go:282] 0 containers: []
	W1202 19:17:27.481374   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:17:27.481380   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:17:27.481437   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:17:27.505602   54807 cri.go:89] found id: ""
	I1202 19:17:27.505615   54807 logs.go:282] 0 containers: []
	W1202 19:17:27.505622   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:17:27.505627   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:17:27.505685   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:17:27.531062   54807 cri.go:89] found id: ""
	I1202 19:17:27.531075   54807 logs.go:282] 0 containers: []
	W1202 19:17:27.531082   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:17:27.531087   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:17:27.531143   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:17:27.556614   54807 cri.go:89] found id: ""
	I1202 19:17:27.556628   54807 logs.go:282] 0 containers: []
	W1202 19:17:27.556635   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:17:27.556642   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:17:27.556653   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:17:27.623535   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:17:27.615982   16984 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:27.616782   16984 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:27.618353   16984 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:27.618650   16984 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:27.620147   16984 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1202 19:17:27.623546   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:17:27.623557   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:17:27.692276   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:17:27.692294   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:17:27.728468   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:17:27.728489   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:17:27.790653   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:17:27.790670   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:17:30.302100   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:17:30.313066   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:17:30.313144   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:17:30.340123   54807 cri.go:89] found id: ""
	I1202 19:17:30.340137   54807 logs.go:282] 0 containers: []
	W1202 19:17:30.340144   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:17:30.340149   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:17:30.340208   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:17:30.365806   54807 cri.go:89] found id: ""
	I1202 19:17:30.365820   54807 logs.go:282] 0 containers: []
	W1202 19:17:30.365835   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:17:30.365841   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:17:30.365904   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:17:30.391688   54807 cri.go:89] found id: ""
	I1202 19:17:30.391701   54807 logs.go:282] 0 containers: []
	W1202 19:17:30.391708   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:17:30.391714   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:17:30.391771   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:17:30.416982   54807 cri.go:89] found id: ""
	I1202 19:17:30.416996   54807 logs.go:282] 0 containers: []
	W1202 19:17:30.417013   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:17:30.417019   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:17:30.417117   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:17:30.443139   54807 cri.go:89] found id: ""
	I1202 19:17:30.443153   54807 logs.go:282] 0 containers: []
	W1202 19:17:30.443162   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:17:30.443168   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:17:30.443226   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:17:30.468557   54807 cri.go:89] found id: ""
	I1202 19:17:30.468571   54807 logs.go:282] 0 containers: []
	W1202 19:17:30.468579   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:17:30.468584   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:17:30.468641   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:17:30.494467   54807 cri.go:89] found id: ""
	I1202 19:17:30.494480   54807 logs.go:282] 0 containers: []
	W1202 19:17:30.494488   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:17:30.494502   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:17:30.494515   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:17:30.551986   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:17:30.552005   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:17:30.563168   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:17:30.563184   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:17:30.628562   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:17:30.620608   17094 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:30.621322   17094 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:30.623022   17094 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:30.623454   17094 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:30.625008   17094 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:17:30.620608   17094 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:30.621322   17094 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:30.623022   17094 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:30.623454   17094 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:30.625008   17094 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:17:30.628573   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:17:30.628584   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:17:30.691460   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:17:30.691478   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:17:33.223672   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:17:33.234425   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:17:33.234485   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:17:33.262500   54807 cri.go:89] found id: ""
	I1202 19:17:33.262514   54807 logs.go:282] 0 containers: []
	W1202 19:17:33.262521   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:17:33.262527   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:17:33.262590   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:17:33.287888   54807 cri.go:89] found id: ""
	I1202 19:17:33.287902   54807 logs.go:282] 0 containers: []
	W1202 19:17:33.287921   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:17:33.287926   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:17:33.287995   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:17:33.314581   54807 cri.go:89] found id: ""
	I1202 19:17:33.314594   54807 logs.go:282] 0 containers: []
	W1202 19:17:33.314601   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:17:33.314607   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:17:33.314671   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:17:33.338734   54807 cri.go:89] found id: ""
	I1202 19:17:33.338747   54807 logs.go:282] 0 containers: []
	W1202 19:17:33.338755   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:17:33.338760   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:17:33.338818   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:17:33.363343   54807 cri.go:89] found id: ""
	I1202 19:17:33.363356   54807 logs.go:282] 0 containers: []
	W1202 19:17:33.363363   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:17:33.363369   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:17:33.363425   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:17:33.388256   54807 cri.go:89] found id: ""
	I1202 19:17:33.388270   54807 logs.go:282] 0 containers: []
	W1202 19:17:33.388277   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:17:33.388283   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:17:33.388360   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:17:33.412424   54807 cri.go:89] found id: ""
	I1202 19:17:33.412449   54807 logs.go:282] 0 containers: []
	W1202 19:17:33.412456   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:17:33.412465   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:17:33.412475   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:17:33.467817   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:17:33.467835   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:17:33.479194   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:17:33.479209   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:17:33.548484   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:17:33.540299   17201 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:33.540851   17201 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:33.542579   17201 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:33.543074   17201 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:33.544871   17201 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:17:33.540299   17201 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:33.540851   17201 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:33.542579   17201 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:33.543074   17201 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:33.544871   17201 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:17:33.548494   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:17:33.548505   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:17:33.612889   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:17:33.612909   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:17:36.146985   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:17:36.158019   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:17:36.158079   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:17:36.188906   54807 cri.go:89] found id: ""
	I1202 19:17:36.188919   54807 logs.go:282] 0 containers: []
	W1202 19:17:36.188932   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:17:36.188938   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:17:36.188996   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:17:36.213390   54807 cri.go:89] found id: ""
	I1202 19:17:36.213404   54807 logs.go:282] 0 containers: []
	W1202 19:17:36.213411   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:17:36.213416   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:17:36.213481   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:17:36.242801   54807 cri.go:89] found id: ""
	I1202 19:17:36.242814   54807 logs.go:282] 0 containers: []
	W1202 19:17:36.242822   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:17:36.242827   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:17:36.242882   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:17:36.269121   54807 cri.go:89] found id: ""
	I1202 19:17:36.269142   54807 logs.go:282] 0 containers: []
	W1202 19:17:36.269149   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:17:36.269155   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:17:36.269212   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:17:36.295182   54807 cri.go:89] found id: ""
	I1202 19:17:36.295196   54807 logs.go:282] 0 containers: []
	W1202 19:17:36.295203   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:17:36.295208   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:17:36.295265   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:17:36.320684   54807 cri.go:89] found id: ""
	I1202 19:17:36.320698   54807 logs.go:282] 0 containers: []
	W1202 19:17:36.320705   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:17:36.320711   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:17:36.320783   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:17:36.347524   54807 cri.go:89] found id: ""
	I1202 19:17:36.347537   54807 logs.go:282] 0 containers: []
	W1202 19:17:36.347545   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:17:36.347553   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:17:36.347564   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:17:36.358349   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:17:36.358364   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:17:36.419970   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:17:36.412170   17302 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:36.412950   17302 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:36.414493   17302 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:36.414845   17302 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:36.416368   17302 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:17:36.412170   17302 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:36.412950   17302 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:36.414493   17302 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:36.414845   17302 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:36.416368   17302 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:17:36.419980   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:17:36.419991   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:17:36.482180   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:17:36.482199   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:17:36.511443   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:17:36.511458   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:17:39.067437   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:17:39.077694   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:17:39.077763   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:17:39.102742   54807 cri.go:89] found id: ""
	I1202 19:17:39.102755   54807 logs.go:282] 0 containers: []
	W1202 19:17:39.102762   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:17:39.102768   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:17:39.102824   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:17:39.127352   54807 cri.go:89] found id: ""
	I1202 19:17:39.127365   54807 logs.go:282] 0 containers: []
	W1202 19:17:39.127371   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:17:39.127376   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:17:39.127433   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:17:39.155704   54807 cri.go:89] found id: ""
	I1202 19:17:39.155717   54807 logs.go:282] 0 containers: []
	W1202 19:17:39.155725   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:17:39.155730   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:17:39.155793   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:17:39.181102   54807 cri.go:89] found id: ""
	I1202 19:17:39.181121   54807 logs.go:282] 0 containers: []
	W1202 19:17:39.181128   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:17:39.181133   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:17:39.181193   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:17:39.204855   54807 cri.go:89] found id: ""
	I1202 19:17:39.204869   54807 logs.go:282] 0 containers: []
	W1202 19:17:39.204876   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:17:39.204881   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:17:39.204936   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:17:39.228875   54807 cri.go:89] found id: ""
	I1202 19:17:39.228889   54807 logs.go:282] 0 containers: []
	W1202 19:17:39.228896   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:17:39.228901   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:17:39.228961   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:17:39.254647   54807 cri.go:89] found id: ""
	I1202 19:17:39.254661   54807 logs.go:282] 0 containers: []
	W1202 19:17:39.254668   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:17:39.254681   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:17:39.254696   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:17:39.266611   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:17:39.266628   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:17:39.329195   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:17:39.321572   17406 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:39.322010   17406 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:39.323495   17406 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:39.323811   17406 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:39.325283   17406 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:17:39.321572   17406 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:39.322010   17406 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:39.323495   17406 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:39.323811   17406 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:39.325283   17406 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:17:39.329204   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:17:39.329215   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:17:39.390326   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:17:39.390345   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:17:39.419151   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:17:39.419176   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:17:41.975528   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:17:41.989057   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:17:41.989132   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:17:42.018363   54807 cri.go:89] found id: ""
	I1202 19:17:42.018376   54807 logs.go:282] 0 containers: []
	W1202 19:17:42.018384   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:17:42.018390   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:17:42.018453   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:17:42.045176   54807 cri.go:89] found id: ""
	I1202 19:17:42.045192   54807 logs.go:282] 0 containers: []
	W1202 19:17:42.045200   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:17:42.045206   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:17:42.045290   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:17:42.075758   54807 cri.go:89] found id: ""
	I1202 19:17:42.075773   54807 logs.go:282] 0 containers: []
	W1202 19:17:42.075781   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:17:42.075787   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:17:42.075856   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:17:42.111739   54807 cri.go:89] found id: ""
	I1202 19:17:42.111754   54807 logs.go:282] 0 containers: []
	W1202 19:17:42.111760   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:17:42.111767   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:17:42.111829   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:17:42.141340   54807 cri.go:89] found id: ""
	I1202 19:17:42.141358   54807 logs.go:282] 0 containers: []
	W1202 19:17:42.141368   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:17:42.141374   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:17:42.141453   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:17:42.171125   54807 cri.go:89] found id: ""
	I1202 19:17:42.171140   54807 logs.go:282] 0 containers: []
	W1202 19:17:42.171159   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:17:42.171166   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:17:42.171236   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:17:42.200254   54807 cri.go:89] found id: ""
	I1202 19:17:42.200272   54807 logs.go:282] 0 containers: []
	W1202 19:17:42.200280   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:17:42.200292   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:17:42.200307   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:17:42.256751   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:17:42.256772   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:17:42.269101   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:17:42.269118   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:17:42.336339   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:17:42.328432   17515 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:42.329095   17515 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:42.330828   17515 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:42.331361   17515 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:42.332853   17515 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:17:42.328432   17515 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:42.329095   17515 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:42.330828   17515 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:42.331361   17515 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:42.332853   17515 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:17:42.336350   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:17:42.336361   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:17:42.397522   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:17:42.397540   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:17:44.932481   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:17:44.944310   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:17:44.944439   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:17:44.992545   54807 cri.go:89] found id: ""
	I1202 19:17:44.992561   54807 logs.go:282] 0 containers: []
	W1202 19:17:44.992568   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:17:44.992574   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:17:44.992643   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:17:45.041739   54807 cri.go:89] found id: ""
	I1202 19:17:45.041756   54807 logs.go:282] 0 containers: []
	W1202 19:17:45.041764   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:17:45.041770   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:17:45.041849   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:17:45.083378   54807 cri.go:89] found id: ""
	I1202 19:17:45.083394   54807 logs.go:282] 0 containers: []
	W1202 19:17:45.083402   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:17:45.083407   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:17:45.083483   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:17:45.119179   54807 cri.go:89] found id: ""
	I1202 19:17:45.119206   54807 logs.go:282] 0 containers: []
	W1202 19:17:45.119214   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:17:45.119220   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:17:45.119340   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:17:45.156515   54807 cri.go:89] found id: ""
	I1202 19:17:45.156574   54807 logs.go:282] 0 containers: []
	W1202 19:17:45.156583   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:17:45.156590   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:17:45.156760   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:17:45.195862   54807 cri.go:89] found id: ""
	I1202 19:17:45.195877   54807 logs.go:282] 0 containers: []
	W1202 19:17:45.195885   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:17:45.195892   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:17:45.195968   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:17:45.229425   54807 cri.go:89] found id: ""
	I1202 19:17:45.229448   54807 logs.go:282] 0 containers: []
	W1202 19:17:45.229457   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:17:45.229466   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:17:45.229477   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:17:45.293109   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:17:45.293125   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:17:45.303969   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:17:45.303985   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:17:45.371653   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:17:45.362842   17616 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:45.363639   17616 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:45.365366   17616 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:45.366008   17616 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:45.367671   17616 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:17:45.362842   17616 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:45.363639   17616 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:45.365366   17616 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:45.366008   17616 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:45.367671   17616 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:17:45.371662   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:17:45.371673   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:17:45.436450   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:17:45.436469   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:17:47.967684   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:17:47.979933   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:17:47.980001   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:17:48.006489   54807 cri.go:89] found id: ""
	I1202 19:17:48.006503   54807 logs.go:282] 0 containers: []
	W1202 19:17:48.006511   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:17:48.006517   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:17:48.006580   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:17:48.035723   54807 cri.go:89] found id: ""
	I1202 19:17:48.035737   54807 logs.go:282] 0 containers: []
	W1202 19:17:48.035745   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:17:48.035760   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:17:48.035820   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:17:48.065220   54807 cri.go:89] found id: ""
	I1202 19:17:48.065233   54807 logs.go:282] 0 containers: []
	W1202 19:17:48.065251   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:17:48.065260   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:17:48.065332   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:17:48.088782   54807 cri.go:89] found id: ""
	I1202 19:17:48.088796   54807 logs.go:282] 0 containers: []
	W1202 19:17:48.088803   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:17:48.088809   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:17:48.088865   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:17:48.113775   54807 cri.go:89] found id: ""
	I1202 19:17:48.113788   54807 logs.go:282] 0 containers: []
	W1202 19:17:48.113799   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:17:48.113808   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:17:48.113867   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:17:48.140235   54807 cri.go:89] found id: ""
	I1202 19:17:48.140248   54807 logs.go:282] 0 containers: []
	W1202 19:17:48.140254   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:17:48.140260   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:17:48.140315   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:17:48.166089   54807 cri.go:89] found id: ""
	I1202 19:17:48.166102   54807 logs.go:282] 0 containers: []
	W1202 19:17:48.166108   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:17:48.166116   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:17:48.166126   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:17:48.192826   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:17:48.192842   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:17:48.248078   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:17:48.248098   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:17:48.258722   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:17:48.258737   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:17:48.323436   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:17:48.316144   17731 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:48.316536   17731 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:48.318116   17731 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:48.318415   17731 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:48.319885   17731 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:17:48.316144   17731 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:48.316536   17731 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:48.318116   17731 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:48.318415   17731 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:48.319885   17731 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:17:48.323445   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:17:48.323456   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:17:50.885477   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:17:50.895878   54807 kubeadm.go:602] duration metric: took 4m3.997047772s to restartPrimaryControlPlane
	W1202 19:17:50.895945   54807 out.go:285] ! Unable to restart control-plane node(s), will reset cluster: <no value>
	I1202 19:17:50.896022   54807 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm reset --cri-socket /run/containerd/containerd.sock --force"
	I1202 19:17:51.304711   54807 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I1202 19:17:51.317725   54807 ssh_runner.go:195] Run: sudo cp /var/tmp/minikube/kubeadm.yaml.new /var/tmp/minikube/kubeadm.yaml
	I1202 19:17:51.325312   54807 kubeadm.go:215] ignoring SystemVerification for kubeadm because of docker driver
	I1202 19:17:51.325381   54807 ssh_runner.go:195] Run: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf
	I1202 19:17:51.332895   54807 kubeadm.go:156] config check failed, skipping stale config cleanup: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	ls: cannot access '/etc/kubernetes/admin.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/kubelet.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/controller-manager.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/scheduler.conf': No such file or directory
	I1202 19:17:51.332904   54807 kubeadm.go:158] found existing configuration files:
	
	I1202 19:17:51.332954   54807 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/admin.conf
	I1202 19:17:51.340776   54807 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/admin.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/admin.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/admin.conf: No such file or directory
	I1202 19:17:51.340830   54807 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/admin.conf
	I1202 19:17:51.348141   54807 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf
	I1202 19:17:51.355804   54807 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/kubelet.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/kubelet.conf: No such file or directory
	I1202 19:17:51.355867   54807 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/kubelet.conf
	I1202 19:17:51.363399   54807 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf
	I1202 19:17:51.371055   54807 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/controller-manager.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/controller-manager.conf: No such file or directory
	I1202 19:17:51.371110   54807 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/controller-manager.conf
	I1202 19:17:51.378528   54807 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf
	I1202 19:17:51.386558   54807 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/scheduler.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/scheduler.conf: No such file or directory
	I1202 19:17:51.386618   54807 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/scheduler.conf
	I1202 19:17:51.394349   54807 ssh_runner.go:286] Start: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables"
	I1202 19:17:51.435339   54807 kubeadm.go:319] [init] Using Kubernetes version: v1.35.0-beta.0
	I1202 19:17:51.435446   54807 kubeadm.go:319] [preflight] Running pre-flight checks
	I1202 19:17:51.512672   54807 kubeadm.go:319] [preflight] The system verification failed. Printing the output from the verification:
	I1202 19:17:51.512738   54807 kubeadm.go:319] KERNEL_VERSION: 5.15.0-1084-aws
	I1202 19:17:51.512772   54807 kubeadm.go:319] OS: Linux
	I1202 19:17:51.512816   54807 kubeadm.go:319] CGROUPS_CPU: enabled
	I1202 19:17:51.512863   54807 kubeadm.go:319] CGROUPS_CPUACCT: enabled
	I1202 19:17:51.512909   54807 kubeadm.go:319] CGROUPS_CPUSET: enabled
	I1202 19:17:51.512961   54807 kubeadm.go:319] CGROUPS_DEVICES: enabled
	I1202 19:17:51.513009   54807 kubeadm.go:319] CGROUPS_FREEZER: enabled
	I1202 19:17:51.513055   54807 kubeadm.go:319] CGROUPS_MEMORY: enabled
	I1202 19:17:51.513099   54807 kubeadm.go:319] CGROUPS_PIDS: enabled
	I1202 19:17:51.513146   54807 kubeadm.go:319] CGROUPS_HUGETLB: enabled
	I1202 19:17:51.513190   54807 kubeadm.go:319] CGROUPS_BLKIO: enabled
	I1202 19:17:51.580412   54807 kubeadm.go:319] [preflight] Pulling images required for setting up a Kubernetes cluster
	I1202 19:17:51.580517   54807 kubeadm.go:319] [preflight] This might take a minute or two, depending on the speed of your internet connection
	I1202 19:17:51.580607   54807 kubeadm.go:319] [preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	I1202 19:17:51.588752   54807 kubeadm.go:319] [certs] Using certificateDir folder "/var/lib/minikube/certs"
	I1202 19:17:51.594117   54807 out.go:252]   - Generating certificates and keys ...
	I1202 19:17:51.594201   54807 kubeadm.go:319] [certs] Using existing ca certificate authority
	I1202 19:17:51.594273   54807 kubeadm.go:319] [certs] Using existing apiserver certificate and key on disk
	I1202 19:17:51.594354   54807 kubeadm.go:319] [certs] Using existing apiserver-kubelet-client certificate and key on disk
	I1202 19:17:51.594424   54807 kubeadm.go:319] [certs] Using existing front-proxy-ca certificate authority
	I1202 19:17:51.594494   54807 kubeadm.go:319] [certs] Using existing front-proxy-client certificate and key on disk
	I1202 19:17:51.594547   54807 kubeadm.go:319] [certs] Using existing etcd/ca certificate authority
	I1202 19:17:51.594610   54807 kubeadm.go:319] [certs] Using existing etcd/server certificate and key on disk
	I1202 19:17:51.594671   54807 kubeadm.go:319] [certs] Using existing etcd/peer certificate and key on disk
	I1202 19:17:51.594744   54807 kubeadm.go:319] [certs] Using existing etcd/healthcheck-client certificate and key on disk
	I1202 19:17:51.594818   54807 kubeadm.go:319] [certs] Using existing apiserver-etcd-client certificate and key on disk
	I1202 19:17:51.594855   54807 kubeadm.go:319] [certs] Using the existing "sa" key
	I1202 19:17:51.594910   54807 kubeadm.go:319] [kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	I1202 19:17:51.705531   54807 kubeadm.go:319] [kubeconfig] Writing "admin.conf" kubeconfig file
	I1202 19:17:51.854203   54807 kubeadm.go:319] [kubeconfig] Writing "super-admin.conf" kubeconfig file
	I1202 19:17:52.029847   54807 kubeadm.go:319] [kubeconfig] Writing "kubelet.conf" kubeconfig file
	I1202 19:17:52.545269   54807 kubeadm.go:319] [kubeconfig] Writing "controller-manager.conf" kubeconfig file
	I1202 19:17:52.727822   54807 kubeadm.go:319] [kubeconfig] Writing "scheduler.conf" kubeconfig file
	I1202 19:17:52.728412   54807 kubeadm.go:319] [etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	I1202 19:17:52.730898   54807 kubeadm.go:319] [control-plane] Using manifest folder "/etc/kubernetes/manifests"
	I1202 19:17:52.734122   54807 out.go:252]   - Booting up control plane ...
	I1202 19:17:52.734222   54807 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-apiserver"
	I1202 19:17:52.734305   54807 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-controller-manager"
	I1202 19:17:52.734375   54807 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-scheduler"
	I1202 19:17:52.754118   54807 kubeadm.go:319] [kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	I1202 19:17:52.754386   54807 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	I1202 19:17:52.762146   54807 kubeadm.go:319] [patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	I1202 19:17:52.762405   54807 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	I1202 19:17:52.762460   54807 kubeadm.go:319] [kubelet-start] Starting the kubelet
	I1202 19:17:52.891581   54807 kubeadm.go:319] [wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	I1202 19:17:52.891694   54807 kubeadm.go:319] [kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	I1202 19:21:52.892779   54807 kubeadm.go:319] [kubelet-check] The kubelet is not healthy after 4m0.001197768s
	I1202 19:21:52.892808   54807 kubeadm.go:319] 
	I1202 19:21:52.892871   54807 kubeadm.go:319] Unfortunately, an error has occurred, likely caused by:
	I1202 19:21:52.892903   54807 kubeadm.go:319] 	- The kubelet is not running
	I1202 19:21:52.893025   54807 kubeadm.go:319] 	- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	I1202 19:21:52.893030   54807 kubeadm.go:319] 
	I1202 19:21:52.893133   54807 kubeadm.go:319] If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
	I1202 19:21:52.893170   54807 kubeadm.go:319] 	- 'systemctl status kubelet'
	I1202 19:21:52.893200   54807 kubeadm.go:319] 	- 'journalctl -xeu kubelet'
	I1202 19:21:52.893203   54807 kubeadm.go:319] 
	I1202 19:21:52.897451   54807 kubeadm.go:319] 	[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
	I1202 19:21:52.897878   54807 kubeadm.go:319] 	[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
	I1202 19:21:52.897986   54807 kubeadm.go:319] 	[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	I1202 19:21:52.898220   54807 kubeadm.go:319] error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	I1202 19:21:52.898225   54807 kubeadm.go:319] 
	I1202 19:21:52.898299   54807 kubeadm.go:319] To see the stack trace of this error execute with --v=5 or higher
	W1202 19:21:52.898412   54807 out.go:285] ! initialization failed, will try again: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-beta.0
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.001197768s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	
	To see the stack trace of this error execute with --v=5 or higher
	
	I1202 19:21:52.898501   54807 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm reset --cri-socket /run/containerd/containerd.sock --force"
	I1202 19:21:53.323346   54807 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I1202 19:21:53.337542   54807 kubeadm.go:215] ignoring SystemVerification for kubeadm because of docker driver
	I1202 19:21:53.337600   54807 ssh_runner.go:195] Run: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf
	I1202 19:21:53.345331   54807 kubeadm.go:156] config check failed, skipping stale config cleanup: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	ls: cannot access '/etc/kubernetes/admin.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/kubelet.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/controller-manager.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/scheduler.conf': No such file or directory
	I1202 19:21:53.345341   54807 kubeadm.go:158] found existing configuration files:
	
	I1202 19:21:53.345394   54807 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/admin.conf
	I1202 19:21:53.352948   54807 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/admin.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/admin.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/admin.conf: No such file or directory
	I1202 19:21:53.353002   54807 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/admin.conf
	I1202 19:21:53.360251   54807 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf
	I1202 19:21:53.367769   54807 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/kubelet.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/kubelet.conf: No such file or directory
	I1202 19:21:53.367833   54807 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/kubelet.conf
	I1202 19:21:53.375319   54807 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf
	I1202 19:21:53.383107   54807 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/controller-manager.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/controller-manager.conf: No such file or directory
	I1202 19:21:53.383164   54807 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/controller-manager.conf
	I1202 19:21:53.390823   54807 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf
	I1202 19:21:53.398923   54807 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/scheduler.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/scheduler.conf: No such file or directory
	I1202 19:21:53.398982   54807 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/scheduler.conf
	I1202 19:21:53.406858   54807 ssh_runner.go:286] Start: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables"
	I1202 19:21:53.455640   54807 kubeadm.go:319] [init] Using Kubernetes version: v1.35.0-beta.0
	I1202 19:21:53.455689   54807 kubeadm.go:319] [preflight] Running pre-flight checks
	I1202 19:21:53.530940   54807 kubeadm.go:319] [preflight] The system verification failed. Printing the output from the verification:
	I1202 19:21:53.531008   54807 kubeadm.go:319] KERNEL_VERSION: 5.15.0-1084-aws
	I1202 19:21:53.531042   54807 kubeadm.go:319] OS: Linux
	I1202 19:21:53.531086   54807 kubeadm.go:319] CGROUPS_CPU: enabled
	I1202 19:21:53.531133   54807 kubeadm.go:319] CGROUPS_CPUACCT: enabled
	I1202 19:21:53.531179   54807 kubeadm.go:319] CGROUPS_CPUSET: enabled
	I1202 19:21:53.531226   54807 kubeadm.go:319] CGROUPS_DEVICES: enabled
	I1202 19:21:53.531273   54807 kubeadm.go:319] CGROUPS_FREEZER: enabled
	I1202 19:21:53.531320   54807 kubeadm.go:319] CGROUPS_MEMORY: enabled
	I1202 19:21:53.531364   54807 kubeadm.go:319] CGROUPS_PIDS: enabled
	I1202 19:21:53.531410   54807 kubeadm.go:319] CGROUPS_HUGETLB: enabled
	I1202 19:21:53.531455   54807 kubeadm.go:319] CGROUPS_BLKIO: enabled
	I1202 19:21:53.605461   54807 kubeadm.go:319] [preflight] Pulling images required for setting up a Kubernetes cluster
	I1202 19:21:53.605584   54807 kubeadm.go:319] [preflight] This might take a minute or two, depending on the speed of your internet connection
	I1202 19:21:53.605706   54807 kubeadm.go:319] [preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	I1202 19:21:53.611090   54807 kubeadm.go:319] [certs] Using certificateDir folder "/var/lib/minikube/certs"
	I1202 19:21:53.616552   54807 out.go:252]   - Generating certificates and keys ...
	I1202 19:21:53.616667   54807 kubeadm.go:319] [certs] Using existing ca certificate authority
	I1202 19:21:53.616734   54807 kubeadm.go:319] [certs] Using existing apiserver certificate and key on disk
	I1202 19:21:53.616826   54807 kubeadm.go:319] [certs] Using existing apiserver-kubelet-client certificate and key on disk
	I1202 19:21:53.616887   54807 kubeadm.go:319] [certs] Using existing front-proxy-ca certificate authority
	I1202 19:21:53.616955   54807 kubeadm.go:319] [certs] Using existing front-proxy-client certificate and key on disk
	I1202 19:21:53.617008   54807 kubeadm.go:319] [certs] Using existing etcd/ca certificate authority
	I1202 19:21:53.617070   54807 kubeadm.go:319] [certs] Using existing etcd/server certificate and key on disk
	I1202 19:21:53.617132   54807 kubeadm.go:319] [certs] Using existing etcd/peer certificate and key on disk
	I1202 19:21:53.617207   54807 kubeadm.go:319] [certs] Using existing etcd/healthcheck-client certificate and key on disk
	I1202 19:21:53.617278   54807 kubeadm.go:319] [certs] Using existing apiserver-etcd-client certificate and key on disk
	I1202 19:21:53.617314   54807 kubeadm.go:319] [certs] Using the existing "sa" key
	I1202 19:21:53.617369   54807 kubeadm.go:319] [kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	I1202 19:21:53.704407   54807 kubeadm.go:319] [kubeconfig] Writing "admin.conf" kubeconfig file
	I1202 19:21:53.921613   54807 kubeadm.go:319] [kubeconfig] Writing "super-admin.conf" kubeconfig file
	I1202 19:21:54.521217   54807 kubeadm.go:319] [kubeconfig] Writing "kubelet.conf" kubeconfig file
	I1202 19:21:54.609103   54807 kubeadm.go:319] [kubeconfig] Writing "controller-manager.conf" kubeconfig file
	I1202 19:21:54.800380   54807 kubeadm.go:319] [kubeconfig] Writing "scheduler.conf" kubeconfig file
	I1202 19:21:54.800923   54807 kubeadm.go:319] [etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	I1202 19:21:54.803676   54807 kubeadm.go:319] [control-plane] Using manifest folder "/etc/kubernetes/manifests"
	I1202 19:21:54.806989   54807 out.go:252]   - Booting up control plane ...
	I1202 19:21:54.807091   54807 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-apiserver"
	I1202 19:21:54.807173   54807 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-controller-manager"
	I1202 19:21:54.807243   54807 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-scheduler"
	I1202 19:21:54.831648   54807 kubeadm.go:319] [kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	I1202 19:21:54.831750   54807 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	I1202 19:21:54.839547   54807 kubeadm.go:319] [patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	I1202 19:21:54.840014   54807 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	I1202 19:21:54.840081   54807 kubeadm.go:319] [kubelet-start] Starting the kubelet
	I1202 19:21:54.986075   54807 kubeadm.go:319] [wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	I1202 19:21:54.986189   54807 kubeadm.go:319] [kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	I1202 19:25:54.986676   54807 kubeadm.go:319] [kubelet-check] The kubelet is not healthy after 4m0.001082452s
	I1202 19:25:54.986700   54807 kubeadm.go:319] 
	I1202 19:25:54.986752   54807 kubeadm.go:319] Unfortunately, an error has occurred, likely caused by:
	I1202 19:25:54.986782   54807 kubeadm.go:319] 	- The kubelet is not running
	I1202 19:25:54.986880   54807 kubeadm.go:319] 	- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	I1202 19:25:54.986884   54807 kubeadm.go:319] 
	I1202 19:25:54.986982   54807 kubeadm.go:319] If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
	I1202 19:25:54.987011   54807 kubeadm.go:319] 	- 'systemctl status kubelet'
	I1202 19:25:54.987040   54807 kubeadm.go:319] 	- 'journalctl -xeu kubelet'
	I1202 19:25:54.987043   54807 kubeadm.go:319] 
	I1202 19:25:54.991498   54807 kubeadm.go:319] 	[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
	I1202 19:25:54.991923   54807 kubeadm.go:319] 	[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
	I1202 19:25:54.992031   54807 kubeadm.go:319] 	[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	I1202 19:25:54.992264   54807 kubeadm.go:319] error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	I1202 19:25:54.992269   54807 kubeadm.go:319] 
	I1202 19:25:54.992355   54807 kubeadm.go:319] To see the stack trace of this error execute with --v=5 or higher
	I1202 19:25:54.992407   54807 kubeadm.go:403] duration metric: took 12m8.130118214s to StartCluster
	I1202 19:25:54.992437   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:25:54.992498   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:25:55.018059   54807 cri.go:89] found id: ""
	I1202 19:25:55.018073   54807 logs.go:282] 0 containers: []
	W1202 19:25:55.018079   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:25:55.018085   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:25:55.018141   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:25:55.046728   54807 cri.go:89] found id: ""
	I1202 19:25:55.046741   54807 logs.go:282] 0 containers: []
	W1202 19:25:55.046749   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:25:55.046755   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:25:55.046820   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:25:55.073607   54807 cri.go:89] found id: ""
	I1202 19:25:55.073621   54807 logs.go:282] 0 containers: []
	W1202 19:25:55.073629   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:25:55.073638   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:25:55.073698   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:25:55.098149   54807 cri.go:89] found id: ""
	I1202 19:25:55.098163   54807 logs.go:282] 0 containers: []
	W1202 19:25:55.098170   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:25:55.098175   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:25:55.098231   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:25:55.126700   54807 cri.go:89] found id: ""
	I1202 19:25:55.126714   54807 logs.go:282] 0 containers: []
	W1202 19:25:55.126721   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:25:55.126727   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:25:55.126783   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:25:55.151684   54807 cri.go:89] found id: ""
	I1202 19:25:55.151697   54807 logs.go:282] 0 containers: []
	W1202 19:25:55.151704   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:25:55.151718   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:25:55.151776   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:25:55.179814   54807 cri.go:89] found id: ""
	I1202 19:25:55.179827   54807 logs.go:282] 0 containers: []
	W1202 19:25:55.179834   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:25:55.179842   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:25:55.179852   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:25:55.209677   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:25:55.209693   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:25:55.267260   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:25:55.267277   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:25:55.278280   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:25:55.278301   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:25:55.341995   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:25:55.334367   21526 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:25:55.334759   21526 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:25:55.336266   21526 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:25:55.336611   21526 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:25:55.338027   21526 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:25:55.334367   21526 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:25:55.334759   21526 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:25:55.336266   21526 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:25:55.336611   21526 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:25:55.338027   21526 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:25:55.342006   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:25:55.342016   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	W1202 19:25:55.404636   54807 out.go:434] Error starting cluster: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-beta.0
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.001082452s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	
	To see the stack trace of this error execute with --v=5 or higher
	W1202 19:25:55.404681   54807 out.go:285] * 
	W1202 19:25:55.404792   54807 out.go:285] X Error starting cluster: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-beta.0
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.001082452s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	
	To see the stack trace of this error execute with --v=5 or higher
	
	W1202 19:25:55.404837   54807 out.go:285] * 
	W1202 19:25:55.406981   54807 out.go:308] ╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	I1202 19:25:55.412566   54807 out.go:203] 
	W1202 19:25:55.416194   54807 out.go:285] X Exiting due to K8S_KUBELET_NOT_RUNNING: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-beta.0
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.001082452s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	
	To see the stack trace of this error execute with --v=5 or higher
	
	W1202 19:25:55.416239   54807 out.go:285] * Suggestion: Check output of 'journalctl -xeu kubelet', try passing --extra-config=kubelet.cgroup-driver=systemd to minikube start
	W1202 19:25:55.416259   54807 out.go:285] * Related issue: https://github.com/kubernetes/minikube/issues/4172
	I1202 19:25:55.420152   54807 out.go:203] 
	
	
	==> container status <==
	CONTAINER           IMAGE               CREATED             STATE               NAME                ATTEMPT             POD ID              POD                 NAMESPACE
	
	
	==> containerd <==
	Dec 02 19:13:45 functional-449836 containerd[10263]: time="2025-12-02T19:13:45.578599853Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1
	Dec 02 19:13:45 functional-449836 containerd[10263]: time="2025-12-02T19:13:45.578662622Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1
	Dec 02 19:13:45 functional-449836 containerd[10263]: time="2025-12-02T19:13:45.578725071Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1
	Dec 02 19:13:45 functional-449836 containerd[10263]: time="2025-12-02T19:13:45.578783803Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1
	Dec 02 19:13:45 functional-449836 containerd[10263]: time="2025-12-02T19:13:45.578855024Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1
	Dec 02 19:13:45 functional-449836 containerd[10263]: time="2025-12-02T19:13:45.578924119Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1
	Dec 02 19:13:45 functional-449836 containerd[10263]: time="2025-12-02T19:13:45.578991820Z" level=info msg="runtime interface created"
	Dec 02 19:13:45 functional-449836 containerd[10263]: time="2025-12-02T19:13:45.579043832Z" level=info msg="created NRI interface"
	Dec 02 19:13:45 functional-449836 containerd[10263]: time="2025-12-02T19:13:45.579105847Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1
	Dec 02 19:13:45 functional-449836 containerd[10263]: time="2025-12-02T19:13:45.579207451Z" level=info msg="Connect containerd service"
	Dec 02 19:13:45 functional-449836 containerd[10263]: time="2025-12-02T19:13:45.579595759Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this"
	Dec 02 19:13:45 functional-449836 containerd[10263]: time="2025-12-02T19:13:45.580416453Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config"
	Dec 02 19:13:45 functional-449836 containerd[10263]: time="2025-12-02T19:13:45.590441353Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc
	Dec 02 19:13:45 functional-449836 containerd[10263]: time="2025-12-02T19:13:45.590507150Z" level=info msg=serving... address=/run/containerd/containerd.sock
	Dec 02 19:13:45 functional-449836 containerd[10263]: time="2025-12-02T19:13:45.590537673Z" level=info msg="Start subscribing containerd event"
	Dec 02 19:13:45 functional-449836 containerd[10263]: time="2025-12-02T19:13:45.590591277Z" level=info msg="Start recovering state"
	Dec 02 19:13:45 functional-449836 containerd[10263]: time="2025-12-02T19:13:45.614386130Z" level=info msg="Start event monitor"
	Dec 02 19:13:45 functional-449836 containerd[10263]: time="2025-12-02T19:13:45.614577326Z" level=info msg="Start cni network conf syncer for default"
	Dec 02 19:13:45 functional-449836 containerd[10263]: time="2025-12-02T19:13:45.614677601Z" level=info msg="Start streaming server"
	Dec 02 19:13:45 functional-449836 containerd[10263]: time="2025-12-02T19:13:45.614762451Z" level=info msg="Registered namespace \"k8s.io\" with NRI"
	Dec 02 19:13:45 functional-449836 containerd[10263]: time="2025-12-02T19:13:45.614968071Z" level=info msg="runtime interface starting up..."
	Dec 02 19:13:45 functional-449836 containerd[10263]: time="2025-12-02T19:13:45.615037774Z" level=info msg="starting plugins..."
	Dec 02 19:13:45 functional-449836 containerd[10263]: time="2025-12-02T19:13:45.615100272Z" level=info msg="Synchronizing NRI (plugin) with current runtime state"
	Dec 02 19:13:45 functional-449836 containerd[10263]: time="2025-12-02T19:13:45.615329048Z" level=info msg="containerd successfully booted in 0.058232s"
	Dec 02 19:13:45 functional-449836 systemd[1]: Started containerd.service - containerd container runtime.
	
	
	==> describe nodes <==
	command /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" failed with error: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:27:53.002829   23651 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:27:53.003323   23651 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:27:53.004618   23651 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:27:53.005168   23651 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:27:53.006819   23651 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	
	==> dmesg <==
	[Dec 2 18:17] ACPI: SRAT not present
	[  +0.000000] ACPI: SRAT not present
	[  +0.000000] SPI driver altr_a10sr has no spi_device_id for altr,a10sr
	[  +0.015127] device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
	[  +0.494583] systemd[1]: Configuration file /run/systemd/system/netplan-ovs-cleanup.service is marked world-inaccessible. This has no effect as configuration data is accessible via APIs without restrictions. Proceeding anyway.
	[  +0.035754] systemd[1]: /lib/systemd/system/snapd.service:23: Unknown key name 'RestartMode' in section 'Service', ignoring.
	[  +0.870945] ena 0000:00:05.0: LLQ is not supported Fallback to host mode policy.
	[  +6.299680] kauditd_printk_skb: 36 callbacks suppressed
	
	
	==> kernel <==
	 19:27:53 up  1:10,  0 user,  load average: 0.67, 0.30, 0.36
	Linux functional-449836 5.15.0-1084-aws #91~20.04.1-Ubuntu SMP Fri May 2 07:00:04 UTC 2025 aarch64 GNU/Linux
	PRETTY_NAME="Debian GNU/Linux 12 (bookworm)"
	
	
	==> kubelet <==
	Dec 02 19:27:49 functional-449836 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 02 19:27:50 functional-449836 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 474.
	Dec 02 19:27:50 functional-449836 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 02 19:27:50 functional-449836 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 02 19:27:50 functional-449836 kubelet[23486]: E1202 19:27:50.719708   23486 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 02 19:27:50 functional-449836 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 02 19:27:50 functional-449836 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 02 19:27:51 functional-449836 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 475.
	Dec 02 19:27:51 functional-449836 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 02 19:27:51 functional-449836 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 02 19:27:51 functional-449836 kubelet[23525]: E1202 19:27:51.419631   23525 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 02 19:27:51 functional-449836 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 02 19:27:51 functional-449836 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 02 19:27:52 functional-449836 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 476.
	Dec 02 19:27:52 functional-449836 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 02 19:27:52 functional-449836 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 02 19:27:52 functional-449836 kubelet[23563]: E1202 19:27:52.234665   23563 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 02 19:27:52 functional-449836 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 02 19:27:52 functional-449836 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 02 19:27:52 functional-449836 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 477.
	Dec 02 19:27:52 functional-449836 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 02 19:27:52 functional-449836 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 02 19:27:52 functional-449836 kubelet[23644]: E1202 19:27:52.983586   23644 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 02 19:27:52 functional-449836 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 02 19:27:52 functional-449836 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	

-- /stdout --
helpers_test.go:262: (dbg) Run:  out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-449836 -n functional-449836
helpers_test.go:262: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-449836 -n functional-449836: exit status 2 (365.671043ms)

-- stdout --
	Stopped

-- /stdout --
helpers_test.go:262: status error: exit status 2 (may be ok)
helpers_test.go:264: "functional-449836" apiserver is not running, skipping kubectl commands (state="Stopped")
--- FAIL: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/StatusCmd (3.10s)
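
Editor's note: the kubelet crash loop in the log above (restart counters 474-477) traces back to a single validation error: kubelet v1.35 refuses to start on a host still running cgroup v1. The `[WARNING SystemVerification]` message earlier in this report names the opt-out: the kubelet configuration option `FailCgroupV1` must be set to `false`. A minimal sketch of the relevant fragment of the kubelet configuration (the field name and casing are taken from the warning text; treat them as an assumption to verify against the kubelet v1.35 configuration reference, and note that migrating the host to cgroup v2 is the intended long-term fix per KEP-5573):

```yaml
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
# Assumed field name, per the [WARNING SystemVerification] text in the log:
# allow kubelet v1.35+ to start on a host that still uses cgroup v1.
# cgroup v1 support is deprecated and scheduled for removal (KEP-5573).
failCgroupV1: false
```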

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ServiceCmdConnect (2.49s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ServiceCmdConnect
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ServiceCmdConnect

=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ServiceCmdConnect
functional_test.go:1636: (dbg) Run:  kubectl --context functional-449836 create deployment hello-node-connect --image kicbase/echo-server
functional_test.go:1636: (dbg) Non-zero exit: kubectl --context functional-449836 create deployment hello-node-connect --image kicbase/echo-server: exit status 1 (56.490342ms)

** stderr ** 
	error: failed to create deployment: Post "https://192.168.49.2:8441/apis/apps/v1/namespaces/default/deployments?fieldManager=kubectl-create&fieldValidation=Strict": dial tcp 192.168.49.2:8441: connect: connection refused

** /stderr **
functional_test.go:1638: failed to create hello-node deployment with this command "kubectl --context functional-449836 create deployment hello-node-connect --image kicbase/echo-server": exit status 1.
functional_test.go:1608: service test failed - dumping debug information
functional_test.go:1609: -----------------------service failure post-mortem--------------------------------
functional_test.go:1612: (dbg) Run:  kubectl --context functional-449836 describe po hello-node-connect
functional_test.go:1612: (dbg) Non-zero exit: kubectl --context functional-449836 describe po hello-node-connect: exit status 1 (63.819743ms)

** stderr ** 
	The connection to the server 192.168.49.2:8441 was refused - did you specify the right host or port?

                                                
functional_test.go:1614: "kubectl --context functional-449836 describe po hello-node-connect" failed: exit status 1
functional_test.go:1616: hello-node pod describe:
functional_test.go:1618: (dbg) Run:  kubectl --context functional-449836 logs -l app=hello-node-connect
functional_test.go:1618: (dbg) Non-zero exit: kubectl --context functional-449836 logs -l app=hello-node-connect: exit status 1 (63.155666ms)

** stderr ** 
	The connection to the server 192.168.49.2:8441 was refused - did you specify the right host or port?

** /stderr **
functional_test.go:1620: "kubectl --context functional-449836 logs -l app=hello-node-connect" failed: exit status 1
functional_test.go:1622: hello-node logs:
functional_test.go:1624: (dbg) Run:  kubectl --context functional-449836 describe svc hello-node-connect
functional_test.go:1624: (dbg) Non-zero exit: kubectl --context functional-449836 describe svc hello-node-connect: exit status 1 (78.239629ms)

** stderr ** 
	The connection to the server 192.168.49.2:8441 was refused - did you specify the right host or port?

                                                
functional_test.go:1626: "kubectl --context functional-449836 describe svc hello-node-connect" failed: exit status 1
functional_test.go:1628: hello-node svc describe:
helpers_test.go:222: -----------------------post-mortem--------------------------------
helpers_test.go:223: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ServiceCmdConnect]: network settings <======
helpers_test.go:230: HOST ENV snapshots: PROXY env: HTTP_PROXY="<empty>" HTTPS_PROXY="<empty>" NO_PROXY="<empty>"
helpers_test.go:238: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ServiceCmdConnect]: docker inspect <======
helpers_test.go:239: (dbg) Run:  docker inspect functional-449836
helpers_test.go:243: (dbg) docker inspect functional-449836:

-- stdout --
	[
	    {
	        "Id": "6870f21b62bb6903aca3129f1ce4723cf3f2ffad99a50b164f8e2dc04b50e75d",
	        "Created": "2025-12-02T18:58:53.515075222Z",
	        "Path": "/usr/local/bin/entrypoint",
	        "Args": [
	            "/sbin/init"
	        ],
	        "State": {
	            "Status": "running",
	            "Running": true,
	            "Paused": false,
	            "Restarting": false,
	            "OOMKilled": false,
	            "Dead": false,
	            "Pid": 43093,
	            "ExitCode": 0,
	            "Error": "",
	            "StartedAt": "2025-12-02T18:58:53.587847975Z",
	            "FinishedAt": "0001-01-01T00:00:00Z"
	        },
	        "Image": "sha256:ac919894123858c63a6b115b7a0677e38aafc32ba4f00c3ebbd7c61e958451be",
	        "ResolvConfPath": "/var/lib/docker/containers/6870f21b62bb6903aca3129f1ce4723cf3f2ffad99a50b164f8e2dc04b50e75d/resolv.conf",
	        "HostnamePath": "/var/lib/docker/containers/6870f21b62bb6903aca3129f1ce4723cf3f2ffad99a50b164f8e2dc04b50e75d/hostname",
	        "HostsPath": "/var/lib/docker/containers/6870f21b62bb6903aca3129f1ce4723cf3f2ffad99a50b164f8e2dc04b50e75d/hosts",
	        "LogPath": "/var/lib/docker/containers/6870f21b62bb6903aca3129f1ce4723cf3f2ffad99a50b164f8e2dc04b50e75d/6870f21b62bb6903aca3129f1ce4723cf3f2ffad99a50b164f8e2dc04b50e75d-json.log",
	        "Name": "/functional-449836",
	        "RestartCount": 0,
	        "Driver": "overlay2",
	        "Platform": "linux",
	        "MountLabel": "",
	        "ProcessLabel": "",
	        "AppArmorProfile": "unconfined",
	        "ExecIDs": null,
	        "HostConfig": {
	            "Binds": [
	                "/lib/modules:/lib/modules:ro",
	                "functional-449836:/var"
	            ],
	            "ContainerIDFile": "",
	            "LogConfig": {
	                "Type": "json-file",
	                "Config": {}
	            },
	            "NetworkMode": "functional-449836",
	            "PortBindings": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "8441/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ]
	            },
	            "RestartPolicy": {
	                "Name": "no",
	                "MaximumRetryCount": 0
	            },
	            "AutoRemove": false,
	            "VolumeDriver": "",
	            "VolumesFrom": null,
	            "ConsoleSize": [
	                0,
	                0
	            ],
	            "CapAdd": null,
	            "CapDrop": null,
	            "CgroupnsMode": "host",
	            "Dns": [],
	            "DnsOptions": [],
	            "DnsSearch": [],
	            "ExtraHosts": null,
	            "GroupAdd": null,
	            "IpcMode": "private",
	            "Cgroup": "",
	            "Links": null,
	            "OomScoreAdj": 0,
	            "PidMode": "",
	            "Privileged": true,
	            "PublishAllPorts": false,
	            "ReadonlyRootfs": false,
	            "SecurityOpt": [
	                "seccomp=unconfined",
	                "apparmor=unconfined",
	                "label=disable"
	            ],
	            "Tmpfs": {
	                "/run": "",
	                "/tmp": ""
	            },
	            "UTSMode": "",
	            "UsernsMode": "",
	            "ShmSize": 67108864,
	            "Runtime": "runc",
	            "Isolation": "",
	            "CpuShares": 0,
	            "Memory": 4294967296,
	            "NanoCpus": 2000000000,
	            "CgroupParent": "",
	            "BlkioWeight": 0,
	            "BlkioWeightDevice": [],
	            "BlkioDeviceReadBps": [],
	            "BlkioDeviceWriteBps": [],
	            "BlkioDeviceReadIOps": [],
	            "BlkioDeviceWriteIOps": [],
	            "CpuPeriod": 0,
	            "CpuQuota": 0,
	            "CpuRealtimePeriod": 0,
	            "CpuRealtimeRuntime": 0,
	            "CpusetCpus": "",
	            "CpusetMems": "",
	            "Devices": [],
	            "DeviceCgroupRules": null,
	            "DeviceRequests": null,
	            "MemoryReservation": 0,
	            "MemorySwap": 8589934592,
	            "MemorySwappiness": null,
	            "OomKillDisable": false,
	            "PidsLimit": null,
	            "Ulimits": [],
	            "CpuCount": 0,
	            "CpuPercent": 0,
	            "IOMaximumIOps": 0,
	            "IOMaximumBandwidth": 0,
	            "MaskedPaths": null,
	            "ReadonlyPaths": null
	        },
	        "GraphDriver": {
	            "Data": {
	                "ID": "6870f21b62bb6903aca3129f1ce4723cf3f2ffad99a50b164f8e2dc04b50e75d",
	                "LowerDir": "/var/lib/docker/overlay2/41ea5a18fbf8709879a7fc4066a6a4a1474aa86e898ee1ccabe5669a1871131d-init/diff:/var/lib/docker/overlay2/a59c61675ee48e07a7f4a8725bd393449453344ad8907963779ea1c0059d936c/diff",
	                "MergedDir": "/var/lib/docker/overlay2/41ea5a18fbf8709879a7fc4066a6a4a1474aa86e898ee1ccabe5669a1871131d/merged",
	                "UpperDir": "/var/lib/docker/overlay2/41ea5a18fbf8709879a7fc4066a6a4a1474aa86e898ee1ccabe5669a1871131d/diff",
	                "WorkDir": "/var/lib/docker/overlay2/41ea5a18fbf8709879a7fc4066a6a4a1474aa86e898ee1ccabe5669a1871131d/work"
	            },
	            "Name": "overlay2"
	        },
	        "Mounts": [
	            {
	                "Type": "volume",
	                "Name": "functional-449836",
	                "Source": "/var/lib/docker/volumes/functional-449836/_data",
	                "Destination": "/var",
	                "Driver": "local",
	                "Mode": "z",
	                "RW": true,
	                "Propagation": ""
	            },
	            {
	                "Type": "bind",
	                "Source": "/lib/modules",
	                "Destination": "/lib/modules",
	                "Mode": "ro",
	                "RW": false,
	                "Propagation": "rprivate"
	            }
	        ],
	        "Config": {
	            "Hostname": "functional-449836",
	            "Domainname": "",
	            "User": "",
	            "AttachStdin": false,
	            "AttachStdout": false,
	            "AttachStderr": false,
	            "ExposedPorts": {
	                "22/tcp": {},
	                "2376/tcp": {},
	                "32443/tcp": {},
	                "5000/tcp": {},
	                "8441/tcp": {}
	            },
	            "Tty": true,
	            "OpenStdin": false,
	            "StdinOnce": false,
	            "Env": [
	                "container=docker",
	                "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin"
	            ],
	            "Cmd": null,
	            "Image": "gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b",
	            "Volumes": null,
	            "WorkingDir": "/",
	            "Entrypoint": [
	                "/usr/local/bin/entrypoint",
	                "/sbin/init"
	            ],
	            "OnBuild": null,
	            "Labels": {
	                "created_by.minikube.sigs.k8s.io": "true",
	                "mode.minikube.sigs.k8s.io": "functional-449836",
	                "name.minikube.sigs.k8s.io": "functional-449836",
	                "role.minikube.sigs.k8s.io": ""
	            },
	            "StopSignal": "SIGRTMIN+3"
	        },
	        "NetworkSettings": {
	            "Bridge": "",
	            "SandboxID": "410fbaab809a56b556195f7ed6eeff8dcd31e9020fb1dbfacf74828b79df3d88",
	            "SandboxKey": "/var/run/docker/netns/410fbaab809a",
	            "Ports": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "32788"
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "32789"
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "32792"
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "32790"
	                    }
	                ],
	                "8441/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "32791"
	                    }
	                ]
	            },
	            "HairpinMode": false,
	            "LinkLocalIPv6Address": "",
	            "LinkLocalIPv6PrefixLen": 0,
	            "SecondaryIPAddresses": null,
	            "SecondaryIPv6Addresses": null,
	            "EndpointID": "",
	            "Gateway": "",
	            "GlobalIPv6Address": "",
	            "GlobalIPv6PrefixLen": 0,
	            "IPAddress": "",
	            "IPPrefixLen": 0,
	            "IPv6Gateway": "",
	            "MacAddress": "",
	            "Networks": {
	                "functional-449836": {
	                    "IPAMConfig": {
	                        "IPv4Address": "192.168.49.2"
	                    },
	                    "Links": null,
	                    "Aliases": null,
	                    "MacAddress": "82:18:11:ce:46:48",
	                    "DriverOpts": null,
	                    "GwPriority": 0,
	                    "NetworkID": "3cb7d67d4fa267ddd6b37211325b224fb3fb811be8ff57bda18e19f6929ec9c8",
	                    "EndpointID": "20c8c1a67e53d7615656777f73986a40cb1c6affb22c4db185c479ac85cbdb14",
	                    "Gateway": "192.168.49.1",
	                    "IPAddress": "192.168.49.2",
	                    "IPPrefixLen": 24,
	                    "IPv6Gateway": "",
	                    "GlobalIPv6Address": "",
	                    "GlobalIPv6PrefixLen": 0,
	                    "DNSNames": [
	                        "functional-449836",
	                        "6870f21b62bb"
	                    ]
	                }
	            }
	        }
	    }
	]

-- /stdout --
helpers_test.go:247: (dbg) Run:  out/minikube-linux-arm64 status --format={{.Host}} -p functional-449836 -n functional-449836
helpers_test.go:247: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.Host}} -p functional-449836 -n functional-449836: exit status 2 (321.297679ms)

-- stdout --
	Running

-- /stdout --
helpers_test.go:247: status error: exit status 2 (may be ok)
helpers_test.go:252: <<< TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ServiceCmdConnect FAILED: start of post-mortem logs <<<
helpers_test.go:253: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ServiceCmdConnect]: minikube logs <======
helpers_test.go:255: (dbg) Run:  out/minikube-linux-arm64 -p functional-449836 logs -n 25
helpers_test.go:260: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ServiceCmdConnect logs: 
-- stdout --
	
	==> Audit <==
	┌─────────┬──────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┬───────────────────┬─────────┬─────────┬─────────────────────┬─────────────────────┐
	│ COMMAND │                                                                             ARGS                                                                             │      PROFILE      │  USER   │ VERSION │     START TIME      │      END TIME       │
	├─────────┼──────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┼───────────────────┼─────────┼─────────┼─────────────────────┼─────────────────────┤
	│ cache   │ functional-449836 cache reload                                                                                                                               │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:13 UTC │ 02 Dec 25 19:13 UTC │
	│ ssh     │ functional-449836 ssh sudo crictl inspecti registry.k8s.io/pause:latest                                                                                      │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:13 UTC │ 02 Dec 25 19:13 UTC │
	│ cache   │ delete registry.k8s.io/pause:3.1                                                                                                                             │ minikube          │ jenkins │ v1.37.0 │ 02 Dec 25 19:13 UTC │ 02 Dec 25 19:13 UTC │
	│ cache   │ delete registry.k8s.io/pause:latest                                                                                                                          │ minikube          │ jenkins │ v1.37.0 │ 02 Dec 25 19:13 UTC │ 02 Dec 25 19:13 UTC │
	│ kubectl │ functional-449836 kubectl -- --context functional-449836 get pods                                                                                            │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:13 UTC │                     │
	│ start   │ -p functional-449836 --extra-config=apiserver.enable-admission-plugins=NamespaceAutoProvision --wait=all                                                     │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:13 UTC │                     │
	│ config  │ functional-449836 config unset cpus                                                                                                                          │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:26 UTC │ 02 Dec 25 19:26 UTC │
	│ cp      │ functional-449836 cp testdata/cp-test.txt /home/docker/cp-test.txt                                                                                           │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:26 UTC │ 02 Dec 25 19:26 UTC │
	│ config  │ functional-449836 config get cpus                                                                                                                            │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:26 UTC │                     │
	│ config  │ functional-449836 config set cpus 2                                                                                                                          │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:26 UTC │ 02 Dec 25 19:26 UTC │
	│ config  │ functional-449836 config get cpus                                                                                                                            │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:26 UTC │ 02 Dec 25 19:26 UTC │
	│ config  │ functional-449836 config unset cpus                                                                                                                          │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:26 UTC │ 02 Dec 25 19:26 UTC │
	│ ssh     │ functional-449836 ssh -n functional-449836 sudo cat /home/docker/cp-test.txt                                                                                 │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:26 UTC │ 02 Dec 25 19:26 UTC │
	│ config  │ functional-449836 config get cpus                                                                                                                            │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:26 UTC │                     │
	│ ssh     │ functional-449836 ssh echo hello                                                                                                                             │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:26 UTC │ 02 Dec 25 19:26 UTC │
	│ cp      │ functional-449836 cp functional-449836:/home/docker/cp-test.txt /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-beta.0parallelCp3136267943/001/cp-test.txt │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:26 UTC │ 02 Dec 25 19:26 UTC │
	│ ssh     │ functional-449836 ssh cat /etc/hostname                                                                                                                      │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:26 UTC │ 02 Dec 25 19:26 UTC │
	│ ssh     │ functional-449836 ssh -n functional-449836 sudo cat /home/docker/cp-test.txt                                                                                 │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:26 UTC │ 02 Dec 25 19:26 UTC │
	│ tunnel  │ functional-449836 tunnel --alsologtostderr                                                                                                                   │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:26 UTC │                     │
	│ tunnel  │ functional-449836 tunnel --alsologtostderr                                                                                                                   │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:26 UTC │                     │
	│ cp      │ functional-449836 cp testdata/cp-test.txt /tmp/does/not/exist/cp-test.txt                                                                                    │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:26 UTC │ 02 Dec 25 19:26 UTC │
	│ tunnel  │ functional-449836 tunnel --alsologtostderr                                                                                                                   │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:26 UTC │                     │
	│ ssh     │ functional-449836 ssh -n functional-449836 sudo cat /tmp/does/not/exist/cp-test.txt                                                                          │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:26 UTC │ 02 Dec 25 19:26 UTC │
	│ addons  │ functional-449836 addons list                                                                                                                                │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:27 UTC │ 02 Dec 25 19:27 UTC │
	│ addons  │ functional-449836 addons list -o json                                                                                                                        │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:27 UTC │ 02 Dec 25 19:27 UTC │
	└─────────┴──────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┴───────────────────┴─────────┴─────────┴─────────────────────┴─────────────────────┘
	
	
	==> Last Start <==
	Log file created at: 2025/12/02 19:13:42
	Running on machine: ip-172-31-31-251
	Binary: Built with gc go1.25.3 for linux/arm64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I1202 19:13:42.762704   54807 out.go:360] Setting OutFile to fd 1 ...
	I1202 19:13:42.762827   54807 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1202 19:13:42.762831   54807 out.go:374] Setting ErrFile to fd 2...
	I1202 19:13:42.762834   54807 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1202 19:13:42.763078   54807 root.go:338] Updating PATH: /home/jenkins/minikube-integration/22021-2487/.minikube/bin
	I1202 19:13:42.763410   54807 out.go:368] Setting JSON to false
	I1202 19:13:42.764228   54807 start.go:133] hostinfo: {"hostname":"ip-172-31-31-251","uptime":3359,"bootTime":1764699464,"procs":155,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1084-aws","kernelArch":"aarch64","virtualizationSystem":"","virtualizationRole":"","hostId":"982e3628-3742-4b3e-bb63-ac1b07660ec7"}
	I1202 19:13:42.764287   54807 start.go:143] virtualization:  
	I1202 19:13:42.767748   54807 out.go:179] * [functional-449836] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	I1202 19:13:42.771595   54807 out.go:179]   - MINIKUBE_LOCATION=22021
	I1202 19:13:42.771638   54807 notify.go:221] Checking for updates...
	I1202 19:13:42.777727   54807 out.go:179]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I1202 19:13:42.780738   54807 out.go:179]   - KUBECONFIG=/home/jenkins/minikube-integration/22021-2487/kubeconfig
	I1202 19:13:42.783655   54807 out.go:179]   - MINIKUBE_HOME=/home/jenkins/minikube-integration/22021-2487/.minikube
	I1202 19:13:42.786554   54807 out.go:179]   - MINIKUBE_BIN=out/minikube-linux-arm64
	I1202 19:13:42.789556   54807 out.go:179]   - MINIKUBE_FORCE_SYSTEMD=
	I1202 19:13:42.793178   54807 config.go:182] Loaded profile config "functional-449836": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-beta.0
	I1202 19:13:42.793273   54807 driver.go:422] Setting default libvirt URI to qemu:///system
	I1202 19:13:42.817932   54807 docker.go:124] docker version: linux-28.1.1:Docker Engine - Community
	I1202 19:13:42.818037   54807 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1202 19:13:42.893670   54807 info.go:266] docker info: {ID:EOU5:DNGX:XN6V:L2FZ:UXRM:5TWK:EVUR:KC2F:GT7Z:Y4O4:GB77:5PD3 Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:55 SystemTime:2025-12-02 19:13:42.884370868 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:a
arch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214839296 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-31-251 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx P
ath:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1202 19:13:42.893764   54807 docker.go:319] overlay module found
	I1202 19:13:42.896766   54807 out.go:179] * Using the docker driver based on existing profile
	I1202 19:13:42.899559   54807 start.go:309] selected driver: docker
	I1202 19:13:42.899567   54807 start.go:927] validating driver "docker" against &{Name:functional-449836 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-449836 Namespace:default APIServerHAVIP: APIS
erverName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false Disa
bleCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1202 19:13:42.899671   54807 start.go:938] status for docker: {Installed:true Healthy:true Running:false NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I1202 19:13:42.899770   54807 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1202 19:13:42.952802   54807 info.go:266] docker info: {ID:EOU5:DNGX:XN6V:L2FZ:UXRM:5TWK:EVUR:KC2F:GT7Z:Y4O4:GB77:5PD3 Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:55 SystemTime:2025-12-02 19:13:42.943962699 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:a
arch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214839296 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-31-251 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx P
ath:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1202 19:13:42.953225   54807 start_flags.go:992] Waiting for all components: map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true]
	I1202 19:13:42.953247   54807 cni.go:84] Creating CNI manager for ""
	I1202 19:13:42.953303   54807 cni.go:143] "docker" driver + "containerd" runtime found, recommending kindnet
	I1202 19:13:42.953342   54807 start.go:353] cluster config:
	{Name:functional-449836 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-449836 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local C
ontainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false Disab
leCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1202 19:13:42.958183   54807 out.go:179] * Starting "functional-449836" primary control-plane node in "functional-449836" cluster
	I1202 19:13:42.960983   54807 cache.go:134] Beginning downloading kic base image for docker with containerd
	I1202 19:13:42.963884   54807 out.go:179] * Pulling base image v0.0.48-1764169655-21974 ...
	I1202 19:13:42.968058   54807 preload.go:188] Checking if preload exists for k8s version v1.35.0-beta.0 and runtime containerd
	I1202 19:13:42.968252   54807 image.go:81] Checking for gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b in local docker daemon
	I1202 19:13:42.989666   54807 image.go:100] Found gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b in local docker daemon, skipping pull
	I1202 19:13:42.989677   54807 cache.go:158] gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b exists in daemon, skipping load
	W1202 19:13:43.031045   54807 preload.go:144] https://storage.googleapis.com/minikube-preloaded-volume-tarballs/v18/v1.35.0-beta.0/preloaded-images-k8s-v18-v1.35.0-beta.0-containerd-overlay2-arm64.tar.lz4 status code: 404
	W1202 19:13:43.240107   54807 preload.go:144] https://github.com/kubernetes-sigs/minikube-preloads/releases/download/v18/preloaded-images-k8s-v18-v1.35.0-beta.0-containerd-overlay2-arm64.tar.lz4 status code: 404
	I1202 19:13:43.240267   54807 profile.go:143] Saving config to /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/config.json ...
	I1202 19:13:43.240445   54807 cache.go:107] acquiring lock: {Name:mk7e3720bc30e96a70479f1acc707ef52791d566 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1202 19:13:43.240540   54807 cache.go:115] /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/kube-controller-manager_v1.35.0-beta.0 exists
	I1202 19:13:43.240557   54807 cache.go:96] cache image "registry.k8s.io/kube-controller-manager:v1.35.0-beta.0" -> "/home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/kube-controller-manager_v1.35.0-beta.0" took 118.031µs
	I1202 19:13:43.240570   54807 cache.go:80] save to tar file registry.k8s.io/kube-controller-manager:v1.35.0-beta.0 -> /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/kube-controller-manager_v1.35.0-beta.0 succeeded
	I1202 19:13:43.240584   54807 cache.go:107] acquiring lock: {Name:mkfa39bba55c97fa80e441f8dcbaf6dc6a2ab6fc Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1202 19:13:43.240616   54807 cache.go:115] /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/kube-apiserver_v1.35.0-beta.0 exists
	I1202 19:13:43.240621   54807 cache.go:96] cache image "registry.k8s.io/kube-apiserver:v1.35.0-beta.0" -> "/home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/kube-apiserver_v1.35.0-beta.0" took 38.835µs
	I1202 19:13:43.240626   54807 cache.go:80] save to tar file registry.k8s.io/kube-apiserver:v1.35.0-beta.0 -> /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/kube-apiserver_v1.35.0-beta.0 succeeded
	I1202 19:13:43.240809   54807 cache.go:243] Successfully downloaded all kic artifacts
	I1202 19:13:43.240835   54807 start.go:360] acquireMachinesLock for functional-449836: {Name:mk8999fdfa518fc15358d07431fe9bec286a035e Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1202 19:13:43.240864   54807 start.go:364] duration metric: took 20.397µs to acquireMachinesLock for "functional-449836"
	I1202 19:13:43.240875   54807 start.go:96] Skipping create...Using existing machine configuration
	I1202 19:13:43.240879   54807 fix.go:54] fixHost starting: 
	I1202 19:13:43.241152   54807 cli_runner.go:164] Run: docker container inspect functional-449836 --format={{.State.Status}}
	I1202 19:13:43.241336   54807 cache.go:107] acquiring lock: {Name:mkb3ffc95e4b7ac3756206049d851bf516a8abb7 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1202 19:13:43.241393   54807 cache.go:115] /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/gcr.io/k8s-minikube/storage-provisioner_v5 exists
	I1202 19:13:43.241400   54807 cache.go:96] cache image "gcr.io/k8s-minikube/storage-provisioner:v5" -> "/home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/gcr.io/k8s-minikube/storage-provisioner_v5" took 69.973µs
	I1202 19:13:43.241406   54807 cache.go:80] save to tar file gcr.io/k8s-minikube/storage-provisioner:v5 -> /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/gcr.io/k8s-minikube/storage-provisioner_v5 succeeded
	I1202 19:13:43.241456   54807 cache.go:107] acquiring lock: {Name:mk87fcb81abcb9216a37cb770c1db1797c0a7f91 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1202 19:13:43.241496   54807 cache.go:115] /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/kube-scheduler_v1.35.0-beta.0 exists
	I1202 19:13:43.241501   54807 cache.go:96] cache image "registry.k8s.io/kube-scheduler:v1.35.0-beta.0" -> "/home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/kube-scheduler_v1.35.0-beta.0" took 46.589µs
	I1202 19:13:43.241506   54807 cache.go:80] save to tar file registry.k8s.io/kube-scheduler:v1.35.0-beta.0 -> /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/kube-scheduler_v1.35.0-beta.0 succeeded
	I1202 19:13:43.241515   54807 cache.go:107] acquiring lock: {Name:mkb0da8840651a370490ea2b46213e13fc0d5dac Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1202 19:13:43.241539   54807 cache.go:115] /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/kube-proxy_v1.35.0-beta.0 exists
	I1202 19:13:43.241543   54807 cache.go:96] cache image "registry.k8s.io/kube-proxy:v1.35.0-beta.0" -> "/home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/kube-proxy_v1.35.0-beta.0" took 29.662µs
	I1202 19:13:43.241548   54807 cache.go:80] save to tar file registry.k8s.io/kube-proxy:v1.35.0-beta.0 -> /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/kube-proxy_v1.35.0-beta.0 succeeded
	I1202 19:13:43.241556   54807 cache.go:107] acquiring lock: {Name:mk9eec99a3e8e54b076a2ce506d08ceb8a7f49cb Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1202 19:13:43.241581   54807 cache.go:115] /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/pause_3.10.1 exists
	I1202 19:13:43.241585   54807 cache.go:96] cache image "registry.k8s.io/pause:3.10.1" -> "/home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/pause_3.10.1" took 29.85µs
	I1202 19:13:43.241589   54807 cache.go:80] save to tar file registry.k8s.io/pause:3.10.1 -> /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/pause_3.10.1 succeeded
	I1202 19:13:43.241615   54807 cache.go:107] acquiring lock: {Name:mk280b51a6d3bfe0cb60ae7355309f1bf1f99e1d Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1202 19:13:43.241641   54807 cache.go:115] /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/etcd_3.6.5-0 exists
	I1202 19:13:43.241629   54807 cache.go:107] acquiring lock: {Name:mkbf2c8ea9fae755e8e7ae1c483527f313757bae Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1202 19:13:43.241645   54807 cache.go:96] cache image "registry.k8s.io/etcd:3.6.5-0" -> "/home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/etcd_3.6.5-0" took 32.345µs
	I1202 19:13:43.241650   54807 cache.go:80] save to tar file registry.k8s.io/etcd:3.6.5-0 -> /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/etcd_3.6.5-0 succeeded
	I1202 19:13:43.241693   54807 cache.go:115] /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/coredns/coredns_v1.13.1 exists
	I1202 19:13:43.241700   54807 cache.go:96] cache image "registry.k8s.io/coredns/coredns:v1.13.1" -> "/home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/coredns/coredns_v1.13.1" took 86.392µs
	I1202 19:13:43.241706   54807 cache.go:80] save to tar file registry.k8s.io/coredns/coredns:v1.13.1 -> /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/coredns/coredns_v1.13.1 succeeded
	I1202 19:13:43.241720   54807 cache.go:87] Successfully saved all images to host disk.
	I1202 19:13:43.258350   54807 fix.go:112] recreateIfNeeded on functional-449836: state=Running err=<nil>
	W1202 19:13:43.258376   54807 fix.go:138] unexpected machine state, will restart: <nil>
	I1202 19:13:43.261600   54807 out.go:252] * Updating the running docker "functional-449836" container ...
	I1202 19:13:43.261627   54807 machine.go:94] provisionDockerMachine start ...
	I1202 19:13:43.261705   54807 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-449836
	I1202 19:13:43.278805   54807 main.go:143] libmachine: Using SSH client type: native
	I1202 19:13:43.279129   54807 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3dad70] 0x3dd270 <nil>  [] 0s} 127.0.0.1 32788 <nil> <nil>}
	I1202 19:13:43.279134   54807 main.go:143] libmachine: About to run SSH command:
	hostname
	I1202 19:13:43.427938   54807 main.go:143] libmachine: SSH cmd err, output: <nil>: functional-449836
	
	I1202 19:13:43.427951   54807 ubuntu.go:182] provisioning hostname "functional-449836"
	I1202 19:13:43.428028   54807 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-449836
	I1202 19:13:43.447456   54807 main.go:143] libmachine: Using SSH client type: native
	I1202 19:13:43.447752   54807 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3dad70] 0x3dd270 <nil>  [] 0s} 127.0.0.1 32788 <nil> <nil>}
	I1202 19:13:43.447759   54807 main.go:143] libmachine: About to run SSH command:
	sudo hostname functional-449836 && echo "functional-449836" | sudo tee /etc/hostname
	I1202 19:13:43.605729   54807 main.go:143] libmachine: SSH cmd err, output: <nil>: functional-449836
	
	I1202 19:13:43.605800   54807 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-449836
	I1202 19:13:43.624976   54807 main.go:143] libmachine: Using SSH client type: native
	I1202 19:13:43.625283   54807 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3dad70] 0x3dd270 <nil>  [] 0s} 127.0.0.1 32788 <nil> <nil>}
	I1202 19:13:43.625296   54807 main.go:143] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sfunctional-449836' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 functional-449836/g' /etc/hosts;
				else 
					echo '127.0.1.1 functional-449836' | sudo tee -a /etc/hosts; 
				fi
			fi
	I1202 19:13:43.772540   54807 main.go:143] libmachine: SSH cmd err, output: <nil>: 
	I1202 19:13:43.772562   54807 ubuntu.go:188] set auth options {CertDir:/home/jenkins/minikube-integration/22021-2487/.minikube CaCertPath:/home/jenkins/minikube-integration/22021-2487/.minikube/certs/ca.pem CaPrivateKeyPath:/home/jenkins/minikube-integration/22021-2487/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/home/jenkins/minikube-integration/22021-2487/.minikube/machines/server.pem ServerKeyPath:/home/jenkins/minikube-integration/22021-2487/.minikube/machines/server-key.pem ClientKeyPath:/home/jenkins/minikube-integration/22021-2487/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/home/jenkins/minikube-integration/22021-2487/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/home/jenkins/minikube-integration/22021-2487/.minikube}
	I1202 19:13:43.772595   54807 ubuntu.go:190] setting up certificates
	I1202 19:13:43.772604   54807 provision.go:84] configureAuth start
	I1202 19:13:43.772671   54807 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" functional-449836
	I1202 19:13:43.790248   54807 provision.go:143] copyHostCerts
	I1202 19:13:43.790316   54807 exec_runner.go:144] found /home/jenkins/minikube-integration/22021-2487/.minikube/ca.pem, removing ...
	I1202 19:13:43.790328   54807 exec_runner.go:203] rm: /home/jenkins/minikube-integration/22021-2487/.minikube/ca.pem
	I1202 19:13:43.790400   54807 exec_runner.go:151] cp: /home/jenkins/minikube-integration/22021-2487/.minikube/certs/ca.pem --> /home/jenkins/minikube-integration/22021-2487/.minikube/ca.pem (1082 bytes)
	I1202 19:13:43.790504   54807 exec_runner.go:144] found /home/jenkins/minikube-integration/22021-2487/.minikube/cert.pem, removing ...
	I1202 19:13:43.790515   54807 exec_runner.go:203] rm: /home/jenkins/minikube-integration/22021-2487/.minikube/cert.pem
	I1202 19:13:43.790538   54807 exec_runner.go:151] cp: /home/jenkins/minikube-integration/22021-2487/.minikube/certs/cert.pem --> /home/jenkins/minikube-integration/22021-2487/.minikube/cert.pem (1123 bytes)
	I1202 19:13:43.790586   54807 exec_runner.go:144] found /home/jenkins/minikube-integration/22021-2487/.minikube/key.pem, removing ...
	I1202 19:13:43.790589   54807 exec_runner.go:203] rm: /home/jenkins/minikube-integration/22021-2487/.minikube/key.pem
	I1202 19:13:43.790610   54807 exec_runner.go:151] cp: /home/jenkins/minikube-integration/22021-2487/.minikube/certs/key.pem --> /home/jenkins/minikube-integration/22021-2487/.minikube/key.pem (1675 bytes)
	I1202 19:13:43.790652   54807 provision.go:117] generating server cert: /home/jenkins/minikube-integration/22021-2487/.minikube/machines/server.pem ca-key=/home/jenkins/minikube-integration/22021-2487/.minikube/certs/ca.pem private-key=/home/jenkins/minikube-integration/22021-2487/.minikube/certs/ca-key.pem org=jenkins.functional-449836 san=[127.0.0.1 192.168.49.2 functional-449836 localhost minikube]
	I1202 19:13:43.836362   54807 provision.go:177] copyRemoteCerts
	I1202 19:13:43.836414   54807 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I1202 19:13:43.836453   54807 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-449836
	I1202 19:13:43.856436   54807 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:32788 SSHKeyPath:/home/jenkins/minikube-integration/22021-2487/.minikube/machines/functional-449836/id_rsa Username:docker}
	I1202 19:13:43.960942   54807 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1082 bytes)
	I1202 19:13:43.990337   54807 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/machines/server.pem --> /etc/docker/server.pem (1220 bytes)
	I1202 19:13:44.010316   54807 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1675 bytes)
	I1202 19:13:44.028611   54807 provision.go:87] duration metric: took 255.971492ms to configureAuth
	I1202 19:13:44.028629   54807 ubuntu.go:206] setting minikube options for container-runtime
	I1202 19:13:44.028821   54807 config.go:182] Loaded profile config "functional-449836": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-beta.0
	I1202 19:13:44.028827   54807 machine.go:97] duration metric: took 767.195405ms to provisionDockerMachine
	I1202 19:13:44.028833   54807 start.go:293] postStartSetup for "functional-449836" (driver="docker")
	I1202 19:13:44.028844   54807 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I1202 19:13:44.028890   54807 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I1202 19:13:44.028937   54807 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-449836
	I1202 19:13:44.046629   54807 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:32788 SSHKeyPath:/home/jenkins/minikube-integration/22021-2487/.minikube/machines/functional-449836/id_rsa Username:docker}
	I1202 19:13:44.156467   54807 ssh_runner.go:195] Run: cat /etc/os-release
	I1202 19:13:44.159958   54807 main.go:143] libmachine: Couldn't set key VERSION_CODENAME, no corresponding struct field found
	I1202 19:13:44.159979   54807 info.go:137] Remote host: Debian GNU/Linux 12 (bookworm)
	I1202 19:13:44.159992   54807 filesync.go:126] Scanning /home/jenkins/minikube-integration/22021-2487/.minikube/addons for local assets ...
	I1202 19:13:44.160053   54807 filesync.go:126] Scanning /home/jenkins/minikube-integration/22021-2487/.minikube/files for local assets ...
	I1202 19:13:44.160131   54807 filesync.go:149] local asset: /home/jenkins/minikube-integration/22021-2487/.minikube/files/etc/ssl/certs/44352.pem -> 44352.pem in /etc/ssl/certs
	I1202 19:13:44.160205   54807 filesync.go:149] local asset: /home/jenkins/minikube-integration/22021-2487/.minikube/files/etc/test/nested/copy/4435/hosts -> hosts in /etc/test/nested/copy/4435
	I1202 19:13:44.160247   54807 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs /etc/test/nested/copy/4435
	I1202 19:13:44.167846   54807 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/files/etc/ssl/certs/44352.pem --> /etc/ssl/certs/44352.pem (1708 bytes)
	I1202 19:13:44.185707   54807 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/files/etc/test/nested/copy/4435/hosts --> /etc/test/nested/copy/4435/hosts (40 bytes)
	I1202 19:13:44.203573   54807 start.go:296] duration metric: took 174.725487ms for postStartSetup
	I1202 19:13:44.203665   54807 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I1202 19:13:44.203703   54807 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-449836
	I1202 19:13:44.221082   54807 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:32788 SSHKeyPath:/home/jenkins/minikube-integration/22021-2487/.minikube/machines/functional-449836/id_rsa Username:docker}
	I1202 19:13:44.321354   54807 ssh_runner.go:195] Run: sh -c "df -BG /var | awk 'NR==2{print $4}'"
	I1202 19:13:44.325951   54807 fix.go:56] duration metric: took 1.085065634s for fixHost
	I1202 19:13:44.325966   54807 start.go:83] releasing machines lock for "functional-449836", held for 1.08509619s
	I1202 19:13:44.326041   54807 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" functional-449836
	I1202 19:13:44.343136   54807 ssh_runner.go:195] Run: cat /version.json
	I1202 19:13:44.343179   54807 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-449836
	I1202 19:13:44.343439   54807 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I1202 19:13:44.343497   54807 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-449836
	I1202 19:13:44.361296   54807 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:32788 SSHKeyPath:/home/jenkins/minikube-integration/22021-2487/.minikube/machines/functional-449836/id_rsa Username:docker}
	I1202 19:13:44.363895   54807 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:32788 SSHKeyPath:/home/jenkins/minikube-integration/22021-2487/.minikube/machines/functional-449836/id_rsa Username:docker}
	I1202 19:13:44.464126   54807 ssh_runner.go:195] Run: systemctl --version
	I1202 19:13:44.557588   54807 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	W1202 19:13:44.561902   54807 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I1202 19:13:44.561962   54807 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%p, " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I1202 19:13:44.569598   54807 cni.go:259] no active bridge cni configs found in "/etc/cni/net.d" - nothing to disable
	I1202 19:13:44.569611   54807 start.go:496] detecting cgroup driver to use...
	I1202 19:13:44.569649   54807 detect.go:187] detected "cgroupfs" cgroup driver on host os
	I1202 19:13:44.569710   54807 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I1202 19:13:44.587349   54807 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I1202 19:13:44.609174   54807 docker.go:218] disabling cri-docker service (if available) ...
	I1202 19:13:44.609228   54807 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.socket
	I1202 19:13:44.629149   54807 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.service
	I1202 19:13:44.643983   54807 ssh_runner.go:195] Run: sudo systemctl disable cri-docker.socket
	I1202 19:13:44.758878   54807 ssh_runner.go:195] Run: sudo systemctl mask cri-docker.service
	I1202 19:13:44.879635   54807 docker.go:234] disabling docker service ...
	I1202 19:13:44.879691   54807 ssh_runner.go:195] Run: sudo systemctl stop -f docker.socket
	I1202 19:13:44.895449   54807 ssh_runner.go:195] Run: sudo systemctl stop -f docker.service
	I1202 19:13:44.908858   54807 ssh_runner.go:195] Run: sudo systemctl disable docker.socket
	I1202 19:13:45.045971   54807 ssh_runner.go:195] Run: sudo systemctl mask docker.service
	I1202 19:13:45.189406   54807 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service docker
	I1202 19:13:45.215003   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I1202 19:13:45.239052   54807 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.10.1"|' /etc/containerd/config.toml"
	I1202 19:13:45.252425   54807 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I1202 19:13:45.264818   54807 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I1202 19:13:45.264881   54807 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I1202 19:13:45.275398   54807 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I1202 19:13:45.286201   54807 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I1202 19:13:45.295830   54807 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I1202 19:13:45.307108   54807 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I1202 19:13:45.315922   54807 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I1202 19:13:45.325735   54807 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I1202 19:13:45.336853   54807 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
	I1202 19:13:45.346391   54807 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I1202 19:13:45.354212   54807 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I1202 19:13:45.361966   54807 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1202 19:13:45.496442   54807 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I1202 19:13:45.617692   54807 start.go:543] Will wait 60s for socket path /run/containerd/containerd.sock
	I1202 19:13:45.617755   54807 ssh_runner.go:195] Run: stat /run/containerd/containerd.sock
	I1202 19:13:45.622143   54807 start.go:564] Will wait 60s for crictl version
	I1202 19:13:45.622212   54807 ssh_runner.go:195] Run: which crictl
	I1202 19:13:45.626172   54807 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl version
	I1202 19:13:45.650746   54807 start.go:580] Version:  0.1.0
	RuntimeName:  containerd
	RuntimeVersion:  v2.1.5
	RuntimeApiVersion:  v1
	I1202 19:13:45.650812   54807 ssh_runner.go:195] Run: containerd --version
	I1202 19:13:45.670031   54807 ssh_runner.go:195] Run: containerd --version
	I1202 19:13:45.697284   54807 out.go:179] * Preparing Kubernetes v1.35.0-beta.0 on containerd 2.1.5 ...
	I1202 19:13:45.700249   54807 cli_runner.go:164] Run: docker network inspect functional-449836 --format "{"Name": "{{.Name}}","Driver": "{{.Driver}}","Subnet": "{{range .IPAM.Config}}{{.Subnet}}{{end}}","Gateway": "{{range .IPAM.Config}}{{.Gateway}}{{end}}","MTU": {{if (index .Options "com.docker.network.driver.mtu")}}{{(index .Options "com.docker.network.driver.mtu")}}{{else}}0{{end}}, "ContainerIPs": [{{range $k,$v := .Containers }}"{{$v.IPv4Address}}",{{end}}]}"
	I1202 19:13:45.717142   54807 ssh_runner.go:195] Run: grep 192.168.49.1	host.minikube.internal$ /etc/hosts
	I1202 19:13:45.724151   54807 out.go:179]   - apiserver.enable-admission-plugins=NamespaceAutoProvision
	I1202 19:13:45.727141   54807 kubeadm.go:884] updating cluster {Name:functional-449836 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-449836 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} ...
	I1202 19:13:45.727279   54807 preload.go:188] Checking if preload exists for k8s version v1.35.0-beta.0 and runtime containerd
	I1202 19:13:45.727346   54807 ssh_runner.go:195] Run: sudo crictl images --output json
	I1202 19:13:45.751767   54807 containerd.go:627] all images are preloaded for containerd runtime.
	I1202 19:13:45.751786   54807 cache_images.go:86] Images are preloaded, skipping loading
	I1202 19:13:45.751792   54807 kubeadm.go:935] updating node { 192.168.49.2 8441 v1.35.0-beta.0 containerd true true} ...
	I1202 19:13:45.751903   54807 kubeadm.go:947] kubelet [Unit]
	Wants=containerd.service
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.35.0-beta.0/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=functional-449836 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.168.49.2
	
	[Install]
	 config:
	{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-449836 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I1202 19:13:45.751976   54807 ssh_runner.go:195] Run: sudo crictl info
	I1202 19:13:45.777030   54807 extraconfig.go:125] Overwriting default enable-admission-plugins=NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota with user provided enable-admission-plugins=NamespaceAutoProvision for component apiserver
	I1202 19:13:45.777052   54807 cni.go:84] Creating CNI manager for ""
	I1202 19:13:45.777060   54807 cni.go:143] "docker" driver + "containerd" runtime found, recommending kindnet
	I1202 19:13:45.777073   54807 kubeadm.go:85] Using pod CIDR: 10.244.0.0/16
	I1202 19:13:45.777095   54807 kubeadm.go:190] kubeadm options: {CertDir:/var/lib/minikube/certs ServiceCIDR:10.96.0.0/12 PodSubnet:10.244.0.0/16 AdvertiseAddress:192.168.49.2 APIServerPort:8441 KubernetesVersion:v1.35.0-beta.0 EtcdDataDir:/var/lib/minikube/etcd EtcdExtraArgs:map[] ClusterName:functional-449836 NodeName:functional-449836 DNSDomain:cluster.local CRISocket:/run/containerd/containerd.sock ImageRepository: ComponentOptions:[{Component:apiServer ExtraArgs:map[enable-admission-plugins:NamespaceAutoProvision] Pairs:map[certSANs:["127.0.0.1", "localhost", "192.168.49.2"]]} {Component:controllerManager ExtraArgs:map[allocate-node-cidrs:true leader-elect:false] Pairs:map[]} {Component:scheduler ExtraArgs:map[leader-elect:false] Pairs:map[]}] FeatureArgs:map[] NodeIP:192.168.49.2 CgroupDriver:cgroupfs ClientCAFile:/var/lib/minikube/certs/ca.crt StaticPodPath:/etc/kubernetes/manifests ControlPlaneAddress:control-plane.minikube.internal KubeProxyOptions:map[] ResolvConfSearchRegression:false KubeletConfigOpts:map[containerRuntimeEndpoint:unix:///run/containerd/containerd.sock hairpinMode:hairpin-veth runtimeRequestTimeout:15m] PrependCriSocketUnix:true}
	I1202 19:13:45.777203   54807 kubeadm.go:196] kubeadm config:
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: InitConfiguration
	localAPIEndpoint:
	  advertiseAddress: 192.168.49.2
	  bindPort: 8441
	bootstrapTokens:
	  - groups:
	      - system:bootstrappers:kubeadm:default-node-token
	    ttl: 24h0m0s
	    usages:
	      - signing
	      - authentication
	nodeRegistration:
	  criSocket: unix:///run/containerd/containerd.sock
	  name: "functional-449836"
	  kubeletExtraArgs:
	    - name: "node-ip"
	      value: "192.168.49.2"
	  taints: []
	---
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: ClusterConfiguration
	apiServer:
	  certSANs: ["127.0.0.1", "localhost", "192.168.49.2"]
	  extraArgs:
	    - name: "enable-admission-plugins"
	      value: "NamespaceAutoProvision"
	controllerManager:
	  extraArgs:
	    - name: "allocate-node-cidrs"
	      value: "true"
	    - name: "leader-elect"
	      value: "false"
	scheduler:
	  extraArgs:
	    - name: "leader-elect"
	      value: "false"
	certificatesDir: /var/lib/minikube/certs
	clusterName: mk
	controlPlaneEndpoint: control-plane.minikube.internal:8441
	etcd:
	  local:
	    dataDir: /var/lib/minikube/etcd
	kubernetesVersion: v1.35.0-beta.0
	networking:
	  dnsDomain: cluster.local
	  podSubnet: "10.244.0.0/16"
	  serviceSubnet: 10.96.0.0/12
	---
	apiVersion: kubelet.config.k8s.io/v1beta1
	kind: KubeletConfiguration
	authentication:
	  x509:
	    clientCAFile: /var/lib/minikube/certs/ca.crt
	cgroupDriver: cgroupfs
	containerRuntimeEndpoint: unix:///run/containerd/containerd.sock
	hairpinMode: hairpin-veth
	runtimeRequestTimeout: 15m
	clusterDomain: "cluster.local"
	# disable disk resource management by default
	imageGCHighThresholdPercent: 100
	evictionHard:
	  nodefs.available: "0%"
	  nodefs.inodesFree: "0%"
	  imagefs.available: "0%"
	failSwapOn: false
	staticPodPath: /etc/kubernetes/manifests
	---
	apiVersion: kubeproxy.config.k8s.io/v1alpha1
	kind: KubeProxyConfiguration
	clusterCIDR: "10.244.0.0/16"
	metricsBindAddress: 0.0.0.0:10249
	conntrack:
	  maxPerCore: 0
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_established"
	  tcpEstablishedTimeout: 0s
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_close"
	  tcpCloseWaitTimeout: 0s
	
	I1202 19:13:45.777274   54807 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.35.0-beta.0
	I1202 19:13:45.785000   54807 binaries.go:51] Found k8s binaries, skipping transfer
	I1202 19:13:45.785061   54807 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /var/tmp/minikube
	I1202 19:13:45.792592   54807 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (328 bytes)
	I1202 19:13:45.805336   54807 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (359 bytes)
	I1202 19:13:45.818427   54807 ssh_runner.go:362] scp memory --> /var/tmp/minikube/kubeadm.yaml.new (2087 bytes)
	I1202 19:13:45.830990   54807 ssh_runner.go:195] Run: grep 192.168.49.2	control-plane.minikube.internal$ /etc/hosts
	I1202 19:13:45.834935   54807 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1202 19:13:45.945402   54807 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I1202 19:13:46.172299   54807 certs.go:69] Setting up /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836 for IP: 192.168.49.2
	I1202 19:13:46.172311   54807 certs.go:195] generating shared ca certs ...
	I1202 19:13:46.172340   54807 certs.go:227] acquiring lock for ca certs: {Name:mk2ce7651a779b9fbf8eac798f9ac184328de0c6 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1202 19:13:46.172494   54807 certs.go:236] skipping valid "minikubeCA" ca cert: /home/jenkins/minikube-integration/22021-2487/.minikube/ca.key
	I1202 19:13:46.172550   54807 certs.go:236] skipping valid "proxyClientCA" ca cert: /home/jenkins/minikube-integration/22021-2487/.minikube/proxy-client-ca.key
	I1202 19:13:46.172557   54807 certs.go:257] generating profile certs ...
	I1202 19:13:46.172651   54807 certs.go:360] skipping valid signed profile cert regeneration for "minikube-user": /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/client.key
	I1202 19:13:46.172725   54807 certs.go:360] skipping valid signed profile cert regeneration for "minikube": /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/apiserver.key.a65b71da
	I1202 19:13:46.172770   54807 certs.go:360] skipping valid signed profile cert regeneration for "aggregator": /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/proxy-client.key
	I1202 19:13:46.172876   54807 certs.go:484] found cert: /home/jenkins/minikube-integration/22021-2487/.minikube/certs/4435.pem (1338 bytes)
	W1202 19:13:46.172906   54807 certs.go:480] ignoring /home/jenkins/minikube-integration/22021-2487/.minikube/certs/4435_empty.pem, impossibly tiny 0 bytes
	I1202 19:13:46.172913   54807 certs.go:484] found cert: /home/jenkins/minikube-integration/22021-2487/.minikube/certs/ca-key.pem (1679 bytes)
	I1202 19:13:46.172944   54807 certs.go:484] found cert: /home/jenkins/minikube-integration/22021-2487/.minikube/certs/ca.pem (1082 bytes)
	I1202 19:13:46.172967   54807 certs.go:484] found cert: /home/jenkins/minikube-integration/22021-2487/.minikube/certs/cert.pem (1123 bytes)
	I1202 19:13:46.172992   54807 certs.go:484] found cert: /home/jenkins/minikube-integration/22021-2487/.minikube/certs/key.pem (1675 bytes)
	I1202 19:13:46.173034   54807 certs.go:484] found cert: /home/jenkins/minikube-integration/22021-2487/.minikube/files/etc/ssl/certs/44352.pem (1708 bytes)
	I1202 19:13:46.174236   54807 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I1202 19:13:46.206005   54807 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1675 bytes)
	I1202 19:13:46.223256   54807 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I1202 19:13:46.250390   54807 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1675 bytes)
	I1202 19:13:46.270550   54807 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1424 bytes)
	I1202 19:13:46.289153   54807 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1675 bytes)
	I1202 19:13:46.307175   54807 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I1202 19:13:46.325652   54807 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1675 bytes)
	I1202 19:13:46.343823   54807 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/files/etc/ssl/certs/44352.pem --> /usr/share/ca-certificates/44352.pem (1708 bytes)
	I1202 19:13:46.361647   54807 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I1202 19:13:46.379597   54807 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/certs/4435.pem --> /usr/share/ca-certificates/4435.pem (1338 bytes)
	I1202 19:13:46.397750   54807 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (738 bytes)
	I1202 19:13:46.411087   54807 ssh_runner.go:195] Run: openssl version
	I1202 19:13:46.418777   54807 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/4435.pem && ln -fs /usr/share/ca-certificates/4435.pem /etc/ssl/certs/4435.pem"
	I1202 19:13:46.427262   54807 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/4435.pem
	I1202 19:13:46.431022   54807 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Dec  2 18:58 /usr/share/ca-certificates/4435.pem
	I1202 19:13:46.431093   54807 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/4435.pem
	I1202 19:13:46.473995   54807 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/51391683.0 || ln -fs /etc/ssl/certs/4435.pem /etc/ssl/certs/51391683.0"
	I1202 19:13:46.482092   54807 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/44352.pem && ln -fs /usr/share/ca-certificates/44352.pem /etc/ssl/certs/44352.pem"
	I1202 19:13:46.490432   54807 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/44352.pem
	I1202 19:13:46.494266   54807 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Dec  2 18:58 /usr/share/ca-certificates/44352.pem
	I1202 19:13:46.494320   54807 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/44352.pem
	I1202 19:13:46.535125   54807 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/3ec20f2e.0 || ln -fs /etc/ssl/certs/44352.pem /etc/ssl/certs/3ec20f2e.0"
	I1202 19:13:46.543277   54807 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I1202 19:13:46.551769   54807 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I1202 19:13:46.555743   54807 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Dec  2 18:48 /usr/share/ca-certificates/minikubeCA.pem
	I1202 19:13:46.555797   54807 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I1202 19:13:46.597778   54807 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
	I1202 19:13:46.605874   54807 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I1202 19:13:46.609733   54807 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-etcd-client.crt -checkend 86400
	I1202 19:13:46.652482   54807 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-kubelet-client.crt -checkend 86400
	I1202 19:13:46.693214   54807 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/server.crt -checkend 86400
	I1202 19:13:46.734654   54807 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/healthcheck-client.crt -checkend 86400
	I1202 19:13:46.775729   54807 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/peer.crt -checkend 86400
	I1202 19:13:46.821319   54807 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/front-proxy-client.crt -checkend 86400
	I1202 19:13:46.862299   54807 kubeadm.go:401] StartCluster: {Name:functional-449836 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-449836 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1202 19:13:46.862398   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:paused Name: Namespaces:[kube-system]}
	I1202 19:13:46.862468   54807 ssh_runner.go:195] Run: sudo -s eval "crictl ps -a --quiet --label io.kubernetes.pod.namespace=kube-system"
	I1202 19:13:46.891099   54807 cri.go:89] found id: ""
	I1202 19:13:46.891159   54807 ssh_runner.go:195] Run: sudo ls /var/lib/kubelet/kubeadm-flags.env /var/lib/kubelet/config.yaml /var/lib/minikube/etcd
	I1202 19:13:46.898813   54807 kubeadm.go:417] found existing configuration files, will attempt cluster restart
	I1202 19:13:46.898821   54807 kubeadm.go:598] restartPrimaryControlPlane start ...
	I1202 19:13:46.898874   54807 ssh_runner.go:195] Run: sudo test -d /data/minikube
	I1202 19:13:46.906272   54807 kubeadm.go:131] /data/minikube skipping compat symlinks: sudo test -d /data/minikube: Process exited with status 1
	stdout:
	
	stderr:
	I1202 19:13:46.906775   54807 kubeconfig.go:125] found "functional-449836" server: "https://192.168.49.2:8441"
	I1202 19:13:46.908038   54807 ssh_runner.go:195] Run: sudo diff -u /var/tmp/minikube/kubeadm.yaml /var/tmp/minikube/kubeadm.yaml.new
	I1202 19:13:46.915724   54807 kubeadm.go:645] detected kubeadm config drift (will reconfigure cluster from new /var/tmp/minikube/kubeadm.yaml):
	-- stdout --
	--- /var/tmp/minikube/kubeadm.yaml	2025-12-02 18:59:11.521818114 +0000
	+++ /var/tmp/minikube/kubeadm.yaml.new	2025-12-02 19:13:45.826341203 +0000
	@@ -24,7 +24,7 @@
	   certSANs: ["127.0.0.1", "localhost", "192.168.49.2"]
	   extraArgs:
	     - name: "enable-admission-plugins"
	-      value: "NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota"
	+      value: "NamespaceAutoProvision"
	 controllerManager:
	   extraArgs:
	     - name: "allocate-node-cidrs"
	
	-- /stdout --
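[editor's note] The drift check above relies on the exit status of `diff -u`: status 0 means the deployed `kubeadm.yaml` matches the newly generated one, status 1 means they differ and the cluster must be reconfigured (here, because `enable-admission-plugins` changed to `NamespaceAutoProvision`). A minimal sketch of that mechanism, using hypothetical temp files rather than the real `/var/tmp/minikube` paths:

```shell
#!/bin/sh
# Sketch only: demonstrates drift detection via `diff -u` exit status.
# The file contents and paths are hypothetical stand-ins for
# /var/tmp/minikube/kubeadm.yaml and kubeadm.yaml.new.
old=$(mktemp); new=$(mktemp)
printf 'value: "NamespaceLifecycle,...,ResourceQuota"\n' > "$old"
printf 'value: "NamespaceAutoProvision"\n' > "$new"

# diff exits 0 if identical, 1 if different, >1 on error.
if diff -u "$old" "$new" > /tmp/kubeadm.diff 2>&1; then
  echo "no drift; reuse existing cluster config"
else
  echo "config drift detected; will reconfigure cluster"
fi
rm -f "$old" "$new"
```

The important detail is that status >1 (e.g. a missing file) must not be conflated with "drift"; the log's earlier `sudo ls` of the config files guards against that case.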
	I1202 19:13:46.915744   54807 kubeadm.go:1161] stopping kube-system containers ...
	I1202 19:13:46.915757   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name: Namespaces:[kube-system]}
	I1202 19:13:46.915816   54807 ssh_runner.go:195] Run: sudo -s eval "crictl ps -a --quiet --label io.kubernetes.pod.namespace=kube-system"
	I1202 19:13:46.943936   54807 cri.go:89] found id: ""
	I1202 19:13:46.944009   54807 ssh_runner.go:195] Run: sudo systemctl stop kubelet
	I1202 19:13:46.961843   54807 ssh_runner.go:195] Run: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf
	I1202 19:13:46.971074   54807 kubeadm.go:158] found existing configuration files:
	-rw------- 1 root root 5635 Dec  2 19:03 /etc/kubernetes/admin.conf
	-rw------- 1 root root 5640 Dec  2 19:03 /etc/kubernetes/controller-manager.conf
	-rw------- 1 root root 5672 Dec  2 19:03 /etc/kubernetes/kubelet.conf
	-rw------- 1 root root 5584 Dec  2 19:03 /etc/kubernetes/scheduler.conf
	
	I1202 19:13:46.971137   54807 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/admin.conf
	I1202 19:13:46.979452   54807 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf
	I1202 19:13:46.987399   54807 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/kubelet.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf: Process exited with status 1
	stdout:
	
	stderr:
	I1202 19:13:46.987454   54807 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/kubelet.conf
	I1202 19:13:46.994869   54807 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf
	I1202 19:13:47.002498   54807 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/controller-manager.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf: Process exited with status 1
	stdout:
	
	stderr:
	I1202 19:13:47.002560   54807 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/controller-manager.conf
	I1202 19:13:47.010116   54807 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf
	I1202 19:13:47.017891   54807 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/scheduler.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf: Process exited with status 1
	stdout:
	
	stderr:
	I1202 19:13:47.017946   54807 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/scheduler.conf
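[editor's note] The grep/rm sequence above checks each kubeconfig file for the expected control-plane endpoint; a non-zero `grep` exit status ("may not be in ...: Process exited with status 1") means the file points at a stale endpoint, so it is deleted and later regenerated by `kubeadm init phase kubeconfig`. A sketch of that per-file check, against a temp copy rather than the real `/etc/kubernetes` files (the endpoint string is taken from the log; the file is hypothetical):

```shell
#!/bin/sh
# Sketch only: stale-endpoint check mirroring the log's grep/rm pattern.
endpoint="https://control-plane.minikube.internal:8441"
conf=$(mktemp)
# Simulate a kubeconfig written against the node IP instead of the
# control-plane DNS name (hypothetical content).
printf 'server: https://192.168.49.2:8441\n' > "$conf"

# grep -q exits 0 if the endpoint is present, 1 if absent.
if ! grep -q "$endpoint" "$conf"; then
  echo "stale endpoint in $conf; removing so kubeadm regenerates it"
  rm -f "$conf"
fi
```

This explains why all three of kubelet.conf, controller-manager.conf, and scheduler.conf are removed here: none of them contained the `control-plane.minikube.internal` endpoint.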
	I1202 19:13:47.025383   54807 ssh_runner.go:195] Run: sudo cp /var/tmp/minikube/kubeadm.yaml.new /var/tmp/minikube/kubeadm.yaml
	I1202 19:13:47.033423   54807 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init phase certs all --config /var/tmp/minikube/kubeadm.yaml"
	I1202 19:13:47.076377   54807 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init phase kubeconfig all --config /var/tmp/minikube/kubeadm.yaml"
	I1202 19:13:48.395417   54807 ssh_runner.go:235] Completed: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init phase kubeconfig all --config /var/tmp/minikube/kubeadm.yaml": (1.319015091s)
	I1202 19:13:48.395495   54807 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init phase kubelet-start --config /var/tmp/minikube/kubeadm.yaml"
	I1202 19:13:48.604942   54807 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init phase control-plane all --config /var/tmp/minikube/kubeadm.yaml"
	I1202 19:13:48.668399   54807 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init phase etcd local --config /var/tmp/minikube/kubeadm.yaml"
	I1202 19:13:48.712382   54807 api_server.go:52] waiting for apiserver process to appear ...
	I1202 19:13:48.712452   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:13:49.212900   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:13:49.713354   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:13:50.213340   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:13:50.713260   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:13:51.212621   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:13:51.713471   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:13:52.213212   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:13:52.712687   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:13:53.212572   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:13:53.713310   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:13:54.212640   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:13:54.712595   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:13:55.213133   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:13:55.712621   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:13:56.212595   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:13:56.713443   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:13:57.213230   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:13:57.713055   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:13:58.213071   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:13:58.712680   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:13:59.213352   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:13:59.712654   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:00.213647   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:00.712569   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:01.212673   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:01.713030   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:02.212581   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:02.712631   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:03.213287   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:03.712572   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:04.213500   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:04.713557   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:05.213523   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:05.713480   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:06.212772   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:06.713553   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:07.213309   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:07.712616   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:08.212729   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:08.713403   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:09.212625   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:09.713385   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:10.212662   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:10.712619   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:11.213505   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:11.712640   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:12.213396   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:12.712571   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:13.212963   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:13.713403   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:14.213457   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:14.712620   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:15.213335   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:15.713379   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:16.212612   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:16.712624   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:17.212573   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:17.713394   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:18.213294   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:18.713516   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:19.213531   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:19.713309   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:20.212591   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:20.713575   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:21.213560   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:21.713513   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:22.213219   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:22.712620   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:23.213273   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:23.713477   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:24.213364   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:24.712581   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:25.212597   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:25.713554   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:26.213205   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:26.712517   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:27.213345   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:27.712602   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:28.212602   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:28.713533   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:29.213188   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:29.713102   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:30.212626   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:30.712732   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:31.212615   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:31.713473   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:32.212590   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:32.712645   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:33.213398   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:33.713081   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:34.213498   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:34.712625   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:35.213560   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:35.712634   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:36.213370   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:36.712576   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:37.213006   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:37.712656   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:38.212594   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:38.713448   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:39.213442   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:39.712577   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:40.212621   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:40.713516   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:41.212756   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:41.712509   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:42.215715   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:42.712573   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:43.212604   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:43.712621   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:44.213283   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:44.712621   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:45.213407   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:45.712947   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:46.213239   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:46.712626   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:47.213260   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:47.713210   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:48.212639   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
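[editor's note] The run of `pgrep` calls above is minikube's apiserver wait loop: it polls `sudo pgrep -xnf kube-apiserver.*minikube.*` roughly every 500ms until the process appears or the deadline expires; here every attempt over the full minute fails, so it falls through to container/log inspection below. The real loop lives in minikube's `api_server.go`; a shell approximation (the timeout parameter is an assumption for illustration) would look like:

```shell
#!/bin/sh
# Sketch only: approximates the ~500ms apiserver poll loop seen in the log.
# -x: match the full line exactly; -n: newest match; -f: match full cmdline.
wait_for_apiserver() {
  timeout="${1:-60}"                       # seconds (hypothetical parameter)
  deadline=$(( $(date +%s) + timeout ))
  while [ "$(date +%s)" -lt "$deadline" ]; do
    if pgrep -xnf 'kube-apiserver.*minikube.*' >/dev/null 2>&1; then
      echo "apiserver process found"
      return 0
    fi
    sleep 0.5
  done
  echo "timed out waiting for apiserver process" >&2
  return 1
}
```

When the loop times out, as it does here, the caller switches strategy: instead of waiting for the process it lists CRI containers by name (`crictl ps -a --quiet --name=kube-apiserver`, etcd, coredns, ...) and gathers kubelet/containerd/dmesg logs to diagnose why the control plane never came up.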
	I1202 19:14:48.713264   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:14:48.713347   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:14:48.742977   54807 cri.go:89] found id: ""
	I1202 19:14:48.742990   54807 logs.go:282] 0 containers: []
	W1202 19:14:48.742997   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:14:48.743002   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:14:48.743061   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:14:48.767865   54807 cri.go:89] found id: ""
	I1202 19:14:48.767879   54807 logs.go:282] 0 containers: []
	W1202 19:14:48.767886   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:14:48.767892   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:14:48.767949   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:14:48.792531   54807 cri.go:89] found id: ""
	I1202 19:14:48.792544   54807 logs.go:282] 0 containers: []
	W1202 19:14:48.792560   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:14:48.792566   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:14:48.792624   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:14:48.821644   54807 cri.go:89] found id: ""
	I1202 19:14:48.821657   54807 logs.go:282] 0 containers: []
	W1202 19:14:48.821665   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:14:48.821670   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:14:48.821729   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:14:48.847227   54807 cri.go:89] found id: ""
	I1202 19:14:48.847246   54807 logs.go:282] 0 containers: []
	W1202 19:14:48.847253   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:14:48.847258   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:14:48.847318   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:14:48.872064   54807 cri.go:89] found id: ""
	I1202 19:14:48.872084   54807 logs.go:282] 0 containers: []
	W1202 19:14:48.872091   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:14:48.872097   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:14:48.872155   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:14:48.895905   54807 cri.go:89] found id: ""
	I1202 19:14:48.895919   54807 logs.go:282] 0 containers: []
	W1202 19:14:48.895925   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:14:48.895933   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:14:48.895945   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:14:48.962492   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:14:48.954400   11315 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:14:48.954981   11315 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:14:48.956769   11315 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:14:48.957381   11315 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:14:48.959046   11315 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:14:48.954400   11315 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:14:48.954981   11315 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:14:48.956769   11315 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:14:48.957381   11315 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:14:48.959046   11315 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:14:48.962515   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:14:48.962526   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:14:49.026861   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:14:49.026881   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:14:49.059991   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:14:49.060006   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:14:49.119340   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:14:49.119357   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:14:51.632315   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:51.642501   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:14:51.642560   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:14:51.669041   54807 cri.go:89] found id: ""
	I1202 19:14:51.669054   54807 logs.go:282] 0 containers: []
	W1202 19:14:51.669061   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:14:51.669086   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:14:51.669150   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:14:51.698828   54807 cri.go:89] found id: ""
	I1202 19:14:51.698857   54807 logs.go:282] 0 containers: []
	W1202 19:14:51.698864   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:14:51.698870   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:14:51.698939   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:14:51.739419   54807 cri.go:89] found id: ""
	I1202 19:14:51.739446   54807 logs.go:282] 0 containers: []
	W1202 19:14:51.739454   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:14:51.739459   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:14:51.739532   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:14:51.764613   54807 cri.go:89] found id: ""
	I1202 19:14:51.764627   54807 logs.go:282] 0 containers: []
	W1202 19:14:51.764633   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:14:51.764639   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:14:51.764698   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:14:51.790197   54807 cri.go:89] found id: ""
	I1202 19:14:51.790211   54807 logs.go:282] 0 containers: []
	W1202 19:14:51.790217   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:14:51.790222   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:14:51.790281   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:14:51.824131   54807 cri.go:89] found id: ""
	I1202 19:14:51.824144   54807 logs.go:282] 0 containers: []
	W1202 19:14:51.824151   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:14:51.824170   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:14:51.824228   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:14:51.848893   54807 cri.go:89] found id: ""
	I1202 19:14:51.848907   54807 logs.go:282] 0 containers: []
	W1202 19:14:51.848914   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:14:51.848922   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:14:51.848932   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:14:51.877099   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:14:51.877114   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:14:51.933539   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:14:51.933560   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:14:51.944309   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:14:51.944346   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:14:52.014156   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:14:52.005976   11437 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:14:52.006753   11437 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:14:52.008549   11437 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:14:52.009096   11437 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:14:52.010706   11437 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:14:52.005976   11437 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:14:52.006753   11437 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:14:52.008549   11437 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:14:52.009096   11437 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:14:52.010706   11437 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:14:52.014167   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:14:52.014178   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:14:54.578451   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:54.588802   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:14:54.588862   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:14:54.613620   54807 cri.go:89] found id: ""
	I1202 19:14:54.613633   54807 logs.go:282] 0 containers: []
	W1202 19:14:54.613640   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:14:54.613646   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:14:54.613704   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:14:54.637471   54807 cri.go:89] found id: ""
	I1202 19:14:54.637486   54807 logs.go:282] 0 containers: []
	W1202 19:14:54.637498   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:14:54.637503   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:14:54.637561   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:14:54.662053   54807 cri.go:89] found id: ""
	I1202 19:14:54.662066   54807 logs.go:282] 0 containers: []
	W1202 19:14:54.662073   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:14:54.662079   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:14:54.662135   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:14:54.694901   54807 cri.go:89] found id: ""
	I1202 19:14:54.694916   54807 logs.go:282] 0 containers: []
	W1202 19:14:54.694923   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:14:54.694928   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:14:54.694998   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:14:54.728487   54807 cri.go:89] found id: ""
	I1202 19:14:54.728500   54807 logs.go:282] 0 containers: []
	W1202 19:14:54.728507   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:14:54.728512   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:14:54.728569   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:14:54.756786   54807 cri.go:89] found id: ""
	I1202 19:14:54.756800   54807 logs.go:282] 0 containers: []
	W1202 19:14:54.756806   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:14:54.756812   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:14:54.756868   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:14:54.782187   54807 cri.go:89] found id: ""
	I1202 19:14:54.782200   54807 logs.go:282] 0 containers: []
	W1202 19:14:54.782212   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:14:54.782220   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:14:54.782231   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:14:54.846497   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:14:54.838849   11526 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:14:54.839509   11526 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:14:54.840990   11526 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:14:54.841320   11526 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:14:54.842849   11526 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:14:54.838849   11526 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:14:54.839509   11526 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:14:54.840990   11526 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:14:54.841320   11526 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:14:54.842849   11526 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:14:54.846510   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:14:54.846521   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:14:54.909600   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:14:54.909620   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:14:54.943132   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:14:54.943150   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:14:55.006561   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:14:55.006581   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:14:57.519164   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:14:57.529445   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:14:57.529506   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:14:57.554155   54807 cri.go:89] found id: ""
	I1202 19:14:57.554168   54807 logs.go:282] 0 containers: []
	W1202 19:14:57.554176   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:14:57.554181   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:14:57.554240   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:14:57.579453   54807 cri.go:89] found id: ""
	I1202 19:14:57.579468   54807 logs.go:282] 0 containers: []
	W1202 19:14:57.579474   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:14:57.579480   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:14:57.579537   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:14:57.608139   54807 cri.go:89] found id: ""
	I1202 19:14:57.608152   54807 logs.go:282] 0 containers: []
	W1202 19:14:57.608160   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:14:57.608165   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:14:57.608224   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:14:57.632309   54807 cri.go:89] found id: ""
	I1202 19:14:57.632360   54807 logs.go:282] 0 containers: []
	W1202 19:14:57.632368   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:14:57.632374   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:14:57.632434   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:14:57.657933   54807 cri.go:89] found id: ""
	I1202 19:14:57.657947   54807 logs.go:282] 0 containers: []
	W1202 19:14:57.657954   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:14:57.657959   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:14:57.658019   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:14:57.698982   54807 cri.go:89] found id: ""
	I1202 19:14:57.698996   54807 logs.go:282] 0 containers: []
	W1202 19:14:57.699002   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:14:57.699008   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:14:57.699105   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:14:57.738205   54807 cri.go:89] found id: ""
	I1202 19:14:57.738219   54807 logs.go:282] 0 containers: []
	W1202 19:14:57.738226   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:14:57.738234   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:14:57.738245   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:14:57.802193   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:14:57.794304   11633 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:14:57.794844   11633 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:14:57.796409   11633 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:14:57.796881   11633 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:14:57.798306   11633 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:14:57.794304   11633 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:14:57.794844   11633 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:14:57.796409   11633 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:14:57.796881   11633 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:14:57.798306   11633 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:14:57.802204   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:14:57.802215   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:14:57.865638   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:14:57.865657   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:14:57.900835   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:14:57.900850   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:14:57.958121   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:14:57.958139   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:15:00.502580   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:15:00.515602   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:15:00.515692   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:15:00.553262   54807 cri.go:89] found id: ""
	I1202 19:15:00.553290   54807 logs.go:282] 0 containers: []
	W1202 19:15:00.553298   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:15:00.553304   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:15:00.553372   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:15:00.592663   54807 cri.go:89] found id: ""
	I1202 19:15:00.592678   54807 logs.go:282] 0 containers: []
	W1202 19:15:00.592686   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:15:00.592691   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:15:00.592782   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:15:00.624403   54807 cri.go:89] found id: ""
	I1202 19:15:00.624423   54807 logs.go:282] 0 containers: []
	W1202 19:15:00.624431   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:15:00.624438   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:15:00.624521   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:15:00.659265   54807 cri.go:89] found id: ""
	I1202 19:15:00.659280   54807 logs.go:282] 0 containers: []
	W1202 19:15:00.659288   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:15:00.659294   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:15:00.659383   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:15:00.695489   54807 cri.go:89] found id: ""
	I1202 19:15:00.695508   54807 logs.go:282] 0 containers: []
	W1202 19:15:00.695517   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:15:00.695523   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:15:00.695592   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:15:00.732577   54807 cri.go:89] found id: ""
	I1202 19:15:00.732592   54807 logs.go:282] 0 containers: []
	W1202 19:15:00.732600   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:15:00.732607   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:15:00.732696   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:15:00.767521   54807 cri.go:89] found id: ""
	I1202 19:15:00.767538   54807 logs.go:282] 0 containers: []
	W1202 19:15:00.767546   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:15:00.767555   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:15:00.767566   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:15:00.829818   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:15:00.829837   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:15:00.842792   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:15:00.842810   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:15:00.919161   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:15:00.909398   11742 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:00.910484   11742 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:00.912564   11742 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:00.913381   11742 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:00.915298   11742 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:15:00.909398   11742 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:00.910484   11742 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:00.912564   11742 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:00.913381   11742 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:00.915298   11742 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:15:00.919174   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:15:00.919193   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:15:00.985798   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:15:00.985819   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:15:03.521258   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:15:03.531745   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:15:03.531810   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:15:03.556245   54807 cri.go:89] found id: ""
	I1202 19:15:03.556258   54807 logs.go:282] 0 containers: []
	W1202 19:15:03.556265   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:15:03.556271   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:15:03.556355   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:15:03.580774   54807 cri.go:89] found id: ""
	I1202 19:15:03.580787   54807 logs.go:282] 0 containers: []
	W1202 19:15:03.580794   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:15:03.580799   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:15:03.580857   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:15:03.606247   54807 cri.go:89] found id: ""
	I1202 19:15:03.606261   54807 logs.go:282] 0 containers: []
	W1202 19:15:03.606269   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:15:03.606274   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:15:03.606335   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:15:03.631169   54807 cri.go:89] found id: ""
	I1202 19:15:03.631182   54807 logs.go:282] 0 containers: []
	W1202 19:15:03.631189   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:15:03.631195   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:15:03.631252   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:15:03.657089   54807 cri.go:89] found id: ""
	I1202 19:15:03.657111   54807 logs.go:282] 0 containers: []
	W1202 19:15:03.657118   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:15:03.657124   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:15:03.657183   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:15:03.699997   54807 cri.go:89] found id: ""
	I1202 19:15:03.700010   54807 logs.go:282] 0 containers: []
	W1202 19:15:03.700017   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:15:03.700023   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:15:03.700081   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:15:03.725717   54807 cri.go:89] found id: ""
	I1202 19:15:03.725731   54807 logs.go:282] 0 containers: []
	W1202 19:15:03.725738   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:15:03.725746   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:15:03.725755   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:15:03.793907   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:15:03.793928   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:15:03.822178   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:15:03.822199   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:15:03.881429   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:15:03.881453   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:15:03.892554   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:15:03.892569   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:15:03.960792   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:15:03.952836   11862 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:03.953779   11862 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:03.955515   11862 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:03.955829   11862 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:03.957393   11862 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:15:03.952836   11862 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:03.953779   11862 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:03.955515   11862 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:03.955829   11862 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:03.957393   11862 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:15:06.461036   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:15:06.471459   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:15:06.471519   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:15:06.500164   54807 cri.go:89] found id: ""
	I1202 19:15:06.500178   54807 logs.go:282] 0 containers: []
	W1202 19:15:06.500184   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:15:06.500190   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:15:06.500253   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:15:06.526532   54807 cri.go:89] found id: ""
	I1202 19:15:06.526545   54807 logs.go:282] 0 containers: []
	W1202 19:15:06.526552   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:15:06.526558   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:15:06.526616   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:15:06.551534   54807 cri.go:89] found id: ""
	I1202 19:15:06.551553   54807 logs.go:282] 0 containers: []
	W1202 19:15:06.551560   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:15:06.551566   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:15:06.551628   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:15:06.577486   54807 cri.go:89] found id: ""
	I1202 19:15:06.577500   54807 logs.go:282] 0 containers: []
	W1202 19:15:06.577506   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:15:06.577512   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:15:06.577570   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:15:06.607506   54807 cri.go:89] found id: ""
	I1202 19:15:06.607520   54807 logs.go:282] 0 containers: []
	W1202 19:15:06.607529   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:15:06.607535   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:15:06.607663   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:15:06.632779   54807 cri.go:89] found id: ""
	I1202 19:15:06.632792   54807 logs.go:282] 0 containers: []
	W1202 19:15:06.632799   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:15:06.632805   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:15:06.632866   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:15:06.656916   54807 cri.go:89] found id: ""
	I1202 19:15:06.656928   54807 logs.go:282] 0 containers: []
	W1202 19:15:06.656936   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:15:06.656943   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:15:06.656953   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:15:06.721178   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:15:06.721197   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:15:06.733421   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:15:06.733437   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:15:06.806706   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:15:06.798437   11957 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:06.799136   11957 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:06.800799   11957 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:06.801507   11957 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:06.803118   11957 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:15:06.798437   11957 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:06.799136   11957 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:06.800799   11957 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:06.801507   11957 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:06.803118   11957 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:15:06.806717   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:15:06.806728   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:15:06.870452   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:15:06.870471   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:15:09.403297   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:15:09.414259   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:15:09.414319   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:15:09.442090   54807 cri.go:89] found id: ""
	I1202 19:15:09.442103   54807 logs.go:282] 0 containers: []
	W1202 19:15:09.442110   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:15:09.442115   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:15:09.442175   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:15:09.471784   54807 cri.go:89] found id: ""
	I1202 19:15:09.471797   54807 logs.go:282] 0 containers: []
	W1202 19:15:09.471804   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:15:09.471809   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:15:09.471887   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:15:09.496688   54807 cri.go:89] found id: ""
	I1202 19:15:09.496701   54807 logs.go:282] 0 containers: []
	W1202 19:15:09.496708   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:15:09.496714   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:15:09.496773   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:15:09.522932   54807 cri.go:89] found id: ""
	I1202 19:15:09.522946   54807 logs.go:282] 0 containers: []
	W1202 19:15:09.522952   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:15:09.522957   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:15:09.523018   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:15:09.550254   54807 cri.go:89] found id: ""
	I1202 19:15:09.550268   54807 logs.go:282] 0 containers: []
	W1202 19:15:09.550275   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:15:09.550280   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:15:09.550341   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:15:09.578955   54807 cri.go:89] found id: ""
	I1202 19:15:09.578968   54807 logs.go:282] 0 containers: []
	W1202 19:15:09.578975   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:15:09.578980   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:15:09.579041   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:15:09.603797   54807 cri.go:89] found id: ""
	I1202 19:15:09.603812   54807 logs.go:282] 0 containers: []
	W1202 19:15:09.603819   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:15:09.603827   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:15:09.603837   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:15:09.660195   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:15:09.660215   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:15:09.671581   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:15:09.671596   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:15:09.755982   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:15:09.748446   12065 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:09.749204   12065 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:09.750900   12065 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:09.751203   12065 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:09.752674   12065 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:15:09.748446   12065 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:09.749204   12065 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:09.750900   12065 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:09.751203   12065 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:09.752674   12065 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:15:09.755993   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:15:09.756013   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:15:09.820958   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:15:09.820977   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:15:12.349982   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:15:12.359890   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:15:12.359953   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:15:12.387716   54807 cri.go:89] found id: ""
	I1202 19:15:12.387729   54807 logs.go:282] 0 containers: []
	W1202 19:15:12.387736   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:15:12.387741   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:15:12.387802   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:15:12.413168   54807 cri.go:89] found id: ""
	I1202 19:15:12.413182   54807 logs.go:282] 0 containers: []
	W1202 19:15:12.413188   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:15:12.413194   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:15:12.413262   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:15:12.441234   54807 cri.go:89] found id: ""
	I1202 19:15:12.441247   54807 logs.go:282] 0 containers: []
	W1202 19:15:12.441253   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:15:12.441262   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:15:12.441321   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:15:12.465660   54807 cri.go:89] found id: ""
	I1202 19:15:12.465673   54807 logs.go:282] 0 containers: []
	W1202 19:15:12.465680   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:15:12.465689   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:15:12.465747   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:15:12.489519   54807 cri.go:89] found id: ""
	I1202 19:15:12.489532   54807 logs.go:282] 0 containers: []
	W1202 19:15:12.489540   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:15:12.489545   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:15:12.489605   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:15:12.514756   54807 cri.go:89] found id: ""
	I1202 19:15:12.514770   54807 logs.go:282] 0 containers: []
	W1202 19:15:12.514777   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:15:12.514782   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:15:12.514843   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:15:12.538845   54807 cri.go:89] found id: ""
	I1202 19:15:12.538858   54807 logs.go:282] 0 containers: []
	W1202 19:15:12.538865   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:15:12.538872   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:15:12.538884   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:15:12.549453   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:15:12.549477   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:15:12.616294   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:15:12.608411   12164 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:12.609095   12164 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:12.610814   12164 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:12.611359   12164 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:12.612794   12164 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:15:12.608411   12164 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:12.609095   12164 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:12.610814   12164 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:12.611359   12164 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:12.612794   12164 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:15:12.616304   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:15:12.616315   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:15:12.679579   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:15:12.679598   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:15:12.712483   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:15:12.712499   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:15:15.277003   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:15:15.287413   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:15:15.287496   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:15:15.313100   54807 cri.go:89] found id: ""
	I1202 19:15:15.313113   54807 logs.go:282] 0 containers: []
	W1202 19:15:15.313120   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:15:15.313135   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:15:15.313194   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:15:15.339367   54807 cri.go:89] found id: ""
	I1202 19:15:15.339381   54807 logs.go:282] 0 containers: []
	W1202 19:15:15.339387   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:15:15.339393   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:15:15.339463   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:15:15.364247   54807 cri.go:89] found id: ""
	I1202 19:15:15.364270   54807 logs.go:282] 0 containers: []
	W1202 19:15:15.364277   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:15:15.364283   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:15:15.364393   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:15:15.389379   54807 cri.go:89] found id: ""
	I1202 19:15:15.389393   54807 logs.go:282] 0 containers: []
	W1202 19:15:15.389401   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:15:15.389412   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:15:15.389472   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:15:15.414364   54807 cri.go:89] found id: ""
	I1202 19:15:15.414378   54807 logs.go:282] 0 containers: []
	W1202 19:15:15.414386   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:15:15.414391   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:15:15.414455   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:15:15.438995   54807 cri.go:89] found id: ""
	I1202 19:15:15.439009   54807 logs.go:282] 0 containers: []
	W1202 19:15:15.439024   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:15:15.439030   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:15:15.439097   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:15:15.467973   54807 cri.go:89] found id: ""
	I1202 19:15:15.467986   54807 logs.go:282] 0 containers: []
	W1202 19:15:15.467993   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:15:15.468001   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:15:15.468010   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:15:15.534212   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:15:15.526543   12263 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:15.527142   12263 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:15.528724   12263 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:15.529277   12263 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:15.530859   12263 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:15:15.526543   12263 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:15.527142   12263 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:15.528724   12263 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:15.529277   12263 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:15.530859   12263 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:15:15.534222   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:15:15.534233   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:15:15.602898   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:15:15.602917   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:15:15.634225   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:15:15.634242   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:15:15.693229   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:15:15.693247   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:15:18.205585   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:15:18.217019   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:15:18.217080   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:15:18.243139   54807 cri.go:89] found id: ""
	I1202 19:15:18.243153   54807 logs.go:282] 0 containers: []
	W1202 19:15:18.243160   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:15:18.243176   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:15:18.243234   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:15:18.266826   54807 cri.go:89] found id: ""
	I1202 19:15:18.266839   54807 logs.go:282] 0 containers: []
	W1202 19:15:18.266846   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:15:18.266851   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:15:18.266911   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:15:18.291760   54807 cri.go:89] found id: ""
	I1202 19:15:18.291773   54807 logs.go:282] 0 containers: []
	W1202 19:15:18.291781   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:15:18.291795   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:15:18.291853   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:15:18.315881   54807 cri.go:89] found id: ""
	I1202 19:15:18.315895   54807 logs.go:282] 0 containers: []
	W1202 19:15:18.315902   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:15:18.315907   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:15:18.315963   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:15:18.354620   54807 cri.go:89] found id: ""
	I1202 19:15:18.354633   54807 logs.go:282] 0 containers: []
	W1202 19:15:18.354640   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:15:18.354649   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:15:18.354708   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:15:18.378919   54807 cri.go:89] found id: ""
	I1202 19:15:18.378932   54807 logs.go:282] 0 containers: []
	W1202 19:15:18.378939   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:15:18.378945   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:15:18.379003   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:15:18.403461   54807 cri.go:89] found id: ""
	I1202 19:15:18.403474   54807 logs.go:282] 0 containers: []
	W1202 19:15:18.403482   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:15:18.403489   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:15:18.403499   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:15:18.460043   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:15:18.460062   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:15:18.471326   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:15:18.471343   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:15:18.533325   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:15:18.525005   12371 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:18.525603   12371 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:18.527360   12371 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:18.527971   12371 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:18.529742   12371 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:15:18.525005   12371 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:18.525603   12371 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:18.527360   12371 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:18.527971   12371 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:18.529742   12371 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:15:18.533335   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:15:18.533346   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:15:18.595843   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:15:18.595862   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:15:21.128472   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:15:21.138623   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:15:21.138683   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:15:21.163008   54807 cri.go:89] found id: ""
	I1202 19:15:21.163021   54807 logs.go:282] 0 containers: []
	W1202 19:15:21.163028   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:15:21.163039   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:15:21.163096   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:15:21.186917   54807 cri.go:89] found id: ""
	I1202 19:15:21.186930   54807 logs.go:282] 0 containers: []
	W1202 19:15:21.186937   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:15:21.186942   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:15:21.187000   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:15:21.212853   54807 cri.go:89] found id: ""
	I1202 19:15:21.212866   54807 logs.go:282] 0 containers: []
	W1202 19:15:21.212873   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:15:21.212878   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:15:21.212937   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:15:21.240682   54807 cri.go:89] found id: ""
	I1202 19:15:21.240695   54807 logs.go:282] 0 containers: []
	W1202 19:15:21.240703   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:15:21.240708   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:15:21.240765   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:15:21.264693   54807 cri.go:89] found id: ""
	I1202 19:15:21.264706   54807 logs.go:282] 0 containers: []
	W1202 19:15:21.264713   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:15:21.264718   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:15:21.264778   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:15:21.288193   54807 cri.go:89] found id: ""
	I1202 19:15:21.288207   54807 logs.go:282] 0 containers: []
	W1202 19:15:21.288214   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:15:21.288219   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:15:21.288278   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:15:21.313950   54807 cri.go:89] found id: ""
	I1202 19:15:21.313964   54807 logs.go:282] 0 containers: []
	W1202 19:15:21.313971   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:15:21.313979   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:15:21.313990   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:15:21.324612   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:15:21.324626   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:15:21.388157   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:15:21.380313   12478 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:21.381141   12478 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:21.382761   12478 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:21.383081   12478 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:21.384783   12478 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:15:21.380313   12478 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:21.381141   12478 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:21.382761   12478 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:21.383081   12478 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:21.384783   12478 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:15:21.388177   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:15:21.388188   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:15:21.451835   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:15:21.451853   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:15:21.480172   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:15:21.480187   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:15:24.037107   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:15:24.047291   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:15:24.047362   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:15:24.072397   54807 cri.go:89] found id: ""
	I1202 19:15:24.072411   54807 logs.go:282] 0 containers: []
	W1202 19:15:24.072418   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:15:24.072424   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:15:24.072486   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:15:24.097793   54807 cri.go:89] found id: ""
	I1202 19:15:24.097807   54807 logs.go:282] 0 containers: []
	W1202 19:15:24.097814   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:15:24.097819   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:15:24.097879   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:15:24.122934   54807 cri.go:89] found id: ""
	I1202 19:15:24.122947   54807 logs.go:282] 0 containers: []
	W1202 19:15:24.122954   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:15:24.122960   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:15:24.123020   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:15:24.147849   54807 cri.go:89] found id: ""
	I1202 19:15:24.147863   54807 logs.go:282] 0 containers: []
	W1202 19:15:24.147869   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:15:24.147875   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:15:24.147935   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:15:24.172919   54807 cri.go:89] found id: ""
	I1202 19:15:24.172932   54807 logs.go:282] 0 containers: []
	W1202 19:15:24.172939   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:15:24.172944   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:15:24.173004   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:15:24.197266   54807 cri.go:89] found id: ""
	I1202 19:15:24.197280   54807 logs.go:282] 0 containers: []
	W1202 19:15:24.197287   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:15:24.197293   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:15:24.197351   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:15:24.222541   54807 cri.go:89] found id: ""
	I1202 19:15:24.222555   54807 logs.go:282] 0 containers: []
	W1202 19:15:24.222562   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:15:24.222572   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:15:24.222582   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:15:24.278762   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:15:24.278784   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:15:24.289861   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:15:24.289877   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:15:24.353810   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:15:24.345889   12589 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:24.346412   12589 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:24.347992   12589 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:24.348533   12589 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:24.350299   12589 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:15:24.345889   12589 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:24.346412   12589 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:24.347992   12589 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:24.348533   12589 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:24.350299   12589 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:15:24.353831   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:15:24.353842   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:15:24.416010   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:15:24.416029   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:15:26.947462   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:15:26.958975   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:15:26.959033   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:15:26.992232   54807 cri.go:89] found id: ""
	I1202 19:15:26.992257   54807 logs.go:282] 0 containers: []
	W1202 19:15:26.992264   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:15:26.992270   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:15:26.992354   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:15:27.021036   54807 cri.go:89] found id: ""
	I1202 19:15:27.021049   54807 logs.go:282] 0 containers: []
	W1202 19:15:27.021056   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:15:27.021062   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:15:27.021119   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:15:27.052008   54807 cri.go:89] found id: ""
	I1202 19:15:27.052022   54807 logs.go:282] 0 containers: []
	W1202 19:15:27.052028   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:15:27.052034   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:15:27.052093   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:15:27.076184   54807 cri.go:89] found id: ""
	I1202 19:15:27.076197   54807 logs.go:282] 0 containers: []
	W1202 19:15:27.076204   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:15:27.076209   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:15:27.076266   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:15:27.100296   54807 cri.go:89] found id: ""
	I1202 19:15:27.100308   54807 logs.go:282] 0 containers: []
	W1202 19:15:27.100315   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:15:27.100355   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:15:27.100413   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:15:27.125762   54807 cri.go:89] found id: ""
	I1202 19:15:27.125776   54807 logs.go:282] 0 containers: []
	W1202 19:15:27.125783   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:15:27.125788   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:15:27.125851   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:15:27.150224   54807 cri.go:89] found id: ""
	I1202 19:15:27.150237   54807 logs.go:282] 0 containers: []
	W1202 19:15:27.150244   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:15:27.150252   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:15:27.150262   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:15:27.178321   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:15:27.178338   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:15:27.233465   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:15:27.233484   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:15:27.244423   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:15:27.244437   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:15:27.311220   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:15:27.303476   12710 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:27.303903   12710 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:27.305730   12710 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:27.306227   12710 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:27.307716   12710 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:15:27.303476   12710 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:27.303903   12710 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:27.305730   12710 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:27.306227   12710 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:27.307716   12710 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:15:27.311235   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:15:27.311246   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:15:29.874091   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:15:29.884341   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:15:29.884402   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:15:29.909943   54807 cri.go:89] found id: ""
	I1202 19:15:29.909962   54807 logs.go:282] 0 containers: []
	W1202 19:15:29.909970   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:15:29.909975   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:15:29.910035   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:15:29.947534   54807 cri.go:89] found id: ""
	I1202 19:15:29.947547   54807 logs.go:282] 0 containers: []
	W1202 19:15:29.947554   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:15:29.947559   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:15:29.947617   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:15:29.989319   54807 cri.go:89] found id: ""
	I1202 19:15:29.989335   54807 logs.go:282] 0 containers: []
	W1202 19:15:29.989343   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:15:29.989349   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:15:29.989414   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:15:30.038828   54807 cri.go:89] found id: ""
	I1202 19:15:30.038842   54807 logs.go:282] 0 containers: []
	W1202 19:15:30.038850   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:15:30.038856   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:15:30.038932   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:15:30.067416   54807 cri.go:89] found id: ""
	I1202 19:15:30.067432   54807 logs.go:282] 0 containers: []
	W1202 19:15:30.067440   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:15:30.067446   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:15:30.067509   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:15:30.094866   54807 cri.go:89] found id: ""
	I1202 19:15:30.094881   54807 logs.go:282] 0 containers: []
	W1202 19:15:30.094888   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:15:30.094896   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:15:30.094958   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:15:30.120930   54807 cri.go:89] found id: ""
	I1202 19:15:30.120959   54807 logs.go:282] 0 containers: []
	W1202 19:15:30.120968   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:15:30.120977   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:15:30.120988   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:15:30.177165   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:15:30.177186   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:15:30.188251   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:15:30.188267   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:15:30.255176   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:15:30.247015   12803 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:30.247757   12803 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:30.249382   12803 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:30.250104   12803 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:30.251678   12803 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:15:30.247015   12803 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:30.247757   12803 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:30.249382   12803 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:30.250104   12803 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:30.251678   12803 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:15:30.255194   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:15:30.255205   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:15:30.323165   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:15:30.323189   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:15:32.854201   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:15:32.864404   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:15:32.864467   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:15:32.890146   54807 cri.go:89] found id: ""
	I1202 19:15:32.890160   54807 logs.go:282] 0 containers: []
	W1202 19:15:32.890166   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:15:32.890172   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:15:32.890239   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:15:32.915189   54807 cri.go:89] found id: ""
	I1202 19:15:32.915202   54807 logs.go:282] 0 containers: []
	W1202 19:15:32.915210   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:15:32.915215   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:15:32.915286   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:15:32.952949   54807 cri.go:89] found id: ""
	I1202 19:15:32.952962   54807 logs.go:282] 0 containers: []
	W1202 19:15:32.952969   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:15:32.952975   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:15:32.953031   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:15:32.986345   54807 cri.go:89] found id: ""
	I1202 19:15:32.986359   54807 logs.go:282] 0 containers: []
	W1202 19:15:32.986366   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:15:32.986371   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:15:32.986435   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:15:33.010880   54807 cri.go:89] found id: ""
	I1202 19:15:33.010894   54807 logs.go:282] 0 containers: []
	W1202 19:15:33.010902   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:15:33.010908   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:15:33.010966   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:15:33.039327   54807 cri.go:89] found id: ""
	I1202 19:15:33.039341   54807 logs.go:282] 0 containers: []
	W1202 19:15:33.039348   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:15:33.039354   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:15:33.039412   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:15:33.064437   54807 cri.go:89] found id: ""
	I1202 19:15:33.064463   54807 logs.go:282] 0 containers: []
	W1202 19:15:33.064470   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:15:33.064478   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:15:33.064488   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:15:33.120755   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:15:33.120773   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:15:33.132552   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:15:33.132575   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:15:33.199378   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:15:33.191204   12907 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:33.191873   12907 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:33.193517   12907 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:33.194121   12907 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:33.195812   12907 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1202 19:15:33.199389   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:15:33.199401   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:15:33.266899   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:15:33.266918   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:15:35.796024   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:15:35.807086   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:15:35.807146   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:15:35.839365   54807 cri.go:89] found id: ""
	I1202 19:15:35.839378   54807 logs.go:282] 0 containers: []
	W1202 19:15:35.839394   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:15:35.839400   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:15:35.839469   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:15:35.872371   54807 cri.go:89] found id: ""
	I1202 19:15:35.872385   54807 logs.go:282] 0 containers: []
	W1202 19:15:35.872393   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:15:35.872398   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:15:35.872467   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:15:35.901242   54807 cri.go:89] found id: ""
	I1202 19:15:35.901255   54807 logs.go:282] 0 containers: []
	W1202 19:15:35.901262   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:15:35.901268   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:15:35.901326   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:15:35.936195   54807 cri.go:89] found id: ""
	I1202 19:15:35.936209   54807 logs.go:282] 0 containers: []
	W1202 19:15:35.936215   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:15:35.936221   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:15:35.936282   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:15:35.965129   54807 cri.go:89] found id: ""
	I1202 19:15:35.965145   54807 logs.go:282] 0 containers: []
	W1202 19:15:35.965153   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:15:35.965159   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:15:35.966675   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:15:35.998286   54807 cri.go:89] found id: ""
	I1202 19:15:35.998299   54807 logs.go:282] 0 containers: []
	W1202 19:15:35.998306   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:15:35.998311   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:15:35.998371   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:15:36.024787   54807 cri.go:89] found id: ""
	I1202 19:15:36.024800   54807 logs.go:282] 0 containers: []
	W1202 19:15:36.024812   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:15:36.024820   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:15:36.024829   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:15:36.081130   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:15:36.081146   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:15:36.092692   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:15:36.092714   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:15:36.154814   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:15:36.146918   13010 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:36.147704   13010 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:36.149430   13010 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:36.149884   13010 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:36.151372   13010 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1202 19:15:36.154824   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:15:36.154837   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:15:36.218034   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:15:36.218052   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:15:38.748085   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:15:38.758270   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:15:38.758328   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:15:38.786304   54807 cri.go:89] found id: ""
	I1202 19:15:38.786317   54807 logs.go:282] 0 containers: []
	W1202 19:15:38.786325   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:15:38.786330   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:15:38.786389   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:15:38.811113   54807 cri.go:89] found id: ""
	I1202 19:15:38.811126   54807 logs.go:282] 0 containers: []
	W1202 19:15:38.811134   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:15:38.811139   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:15:38.811223   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:15:38.836191   54807 cri.go:89] found id: ""
	I1202 19:15:38.836207   54807 logs.go:282] 0 containers: []
	W1202 19:15:38.836214   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:15:38.836219   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:15:38.836278   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:15:38.860383   54807 cri.go:89] found id: ""
	I1202 19:15:38.860396   54807 logs.go:282] 0 containers: []
	W1202 19:15:38.860403   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:15:38.860410   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:15:38.860469   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:15:38.887750   54807 cri.go:89] found id: ""
	I1202 19:15:38.887764   54807 logs.go:282] 0 containers: []
	W1202 19:15:38.887770   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:15:38.887775   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:15:38.887834   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:15:38.914103   54807 cri.go:89] found id: ""
	I1202 19:15:38.914116   54807 logs.go:282] 0 containers: []
	W1202 19:15:38.914123   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:15:38.914128   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:15:38.914184   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:15:38.950405   54807 cri.go:89] found id: ""
	I1202 19:15:38.950418   54807 logs.go:282] 0 containers: []
	W1202 19:15:38.950425   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:15:38.950433   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:15:38.950442   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:15:39.016206   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:15:39.016225   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:15:39.026699   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:15:39.026714   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:15:39.090183   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:15:39.082441   13115 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:39.083071   13115 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:39.084892   13115 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:39.085258   13115 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:39.086741   13115 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1202 19:15:39.090195   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:15:39.090206   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:15:39.151533   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:15:39.151551   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:15:41.681058   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:15:41.691353   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:15:41.691417   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:15:41.716684   54807 cri.go:89] found id: ""
	I1202 19:15:41.716697   54807 logs.go:282] 0 containers: []
	W1202 19:15:41.716704   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:15:41.716710   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:15:41.716768   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:15:41.742096   54807 cri.go:89] found id: ""
	I1202 19:15:41.742110   54807 logs.go:282] 0 containers: []
	W1202 19:15:41.742117   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:15:41.742122   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:15:41.742182   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:15:41.766652   54807 cri.go:89] found id: ""
	I1202 19:15:41.766665   54807 logs.go:282] 0 containers: []
	W1202 19:15:41.766672   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:15:41.766678   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:15:41.766741   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:15:41.791517   54807 cri.go:89] found id: ""
	I1202 19:15:41.791531   54807 logs.go:282] 0 containers: []
	W1202 19:15:41.791538   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:15:41.791544   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:15:41.791600   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:15:41.817700   54807 cri.go:89] found id: ""
	I1202 19:15:41.817713   54807 logs.go:282] 0 containers: []
	W1202 19:15:41.817720   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:15:41.817725   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:15:41.817786   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:15:41.846078   54807 cri.go:89] found id: ""
	I1202 19:15:41.846092   54807 logs.go:282] 0 containers: []
	W1202 19:15:41.846099   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:15:41.846104   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:15:41.846161   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:15:41.874235   54807 cri.go:89] found id: ""
	I1202 19:15:41.874249   54807 logs.go:282] 0 containers: []
	W1202 19:15:41.874258   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:15:41.874268   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:15:41.874278   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:15:41.942286   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:15:41.942307   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:15:41.989723   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:15:41.989740   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:15:42.047707   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:15:42.047728   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:15:42.061053   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:15:42.061073   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:15:42.138885   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:15:42.129369   13235 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:42.130125   13235 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:42.131195   13235 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:42.132151   13235 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:42.133038   13235 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1202 19:15:44.639103   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:15:44.648984   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:15:44.649044   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:15:44.673076   54807 cri.go:89] found id: ""
	I1202 19:15:44.673091   54807 logs.go:282] 0 containers: []
	W1202 19:15:44.673098   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:15:44.673105   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:15:44.673162   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:15:44.696488   54807 cri.go:89] found id: ""
	I1202 19:15:44.696501   54807 logs.go:282] 0 containers: []
	W1202 19:15:44.696507   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:15:44.696512   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:15:44.696568   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:15:44.722164   54807 cri.go:89] found id: ""
	I1202 19:15:44.722177   54807 logs.go:282] 0 containers: []
	W1202 19:15:44.722184   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:15:44.722190   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:15:44.722254   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:15:44.745410   54807 cri.go:89] found id: ""
	I1202 19:15:44.745424   54807 logs.go:282] 0 containers: []
	W1202 19:15:44.745431   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:15:44.745437   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:15:44.745494   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:15:44.769317   54807 cri.go:89] found id: ""
	I1202 19:15:44.769330   54807 logs.go:282] 0 containers: []
	W1202 19:15:44.769337   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:15:44.769342   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:15:44.769404   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:15:44.794282   54807 cri.go:89] found id: ""
	I1202 19:15:44.794295   54807 logs.go:282] 0 containers: []
	W1202 19:15:44.794302   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:15:44.794308   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:15:44.794369   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:15:44.818676   54807 cri.go:89] found id: ""
	I1202 19:15:44.818689   54807 logs.go:282] 0 containers: []
	W1202 19:15:44.818696   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:15:44.818703   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:15:44.818734   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:15:44.829491   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:15:44.829506   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:15:44.892401   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:15:44.884881   13319 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:44.885617   13319 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:44.887285   13319 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:44.887590   13319 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:44.889113   13319 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1202 19:15:44.892427   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:15:44.892438   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:15:44.961436   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:15:44.961457   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:15:45.004301   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:15:45.004340   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:15:47.597359   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:15:47.607380   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:15:47.607436   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:15:47.632361   54807 cri.go:89] found id: ""
	I1202 19:15:47.632375   54807 logs.go:282] 0 containers: []
	W1202 19:15:47.632382   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:15:47.632387   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:15:47.632443   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:15:47.657478   54807 cri.go:89] found id: ""
	I1202 19:15:47.657491   54807 logs.go:282] 0 containers: []
	W1202 19:15:47.657498   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:15:47.657504   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:15:47.657565   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:15:47.681973   54807 cri.go:89] found id: ""
	I1202 19:15:47.681987   54807 logs.go:282] 0 containers: []
	W1202 19:15:47.681994   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:15:47.681999   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:15:47.682054   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:15:47.705968   54807 cri.go:89] found id: ""
	I1202 19:15:47.705982   54807 logs.go:282] 0 containers: []
	W1202 19:15:47.705988   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:15:47.705994   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:15:47.706051   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:15:47.730910   54807 cri.go:89] found id: ""
	I1202 19:15:47.730923   54807 logs.go:282] 0 containers: []
	W1202 19:15:47.730930   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:15:47.730935   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:15:47.730992   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:15:47.757739   54807 cri.go:89] found id: ""
	I1202 19:15:47.757752   54807 logs.go:282] 0 containers: []
	W1202 19:15:47.757759   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:15:47.757764   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:15:47.757820   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:15:47.782566   54807 cri.go:89] found id: ""
	I1202 19:15:47.782579   54807 logs.go:282] 0 containers: []
	W1202 19:15:47.782586   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:15:47.782594   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:15:47.782605   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:15:47.845974   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:15:47.837753   13419 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:47.838402   13419 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:47.840192   13419 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:47.840777   13419 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:47.842546   13419 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:15:47.837753   13419 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:47.838402   13419 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:47.840192   13419 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:47.840777   13419 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:47.842546   13419 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:15:47.845983   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:15:47.845994   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:15:47.913035   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:15:47.913054   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:15:47.952076   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:15:47.952091   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:15:48.023577   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:15:48.023596   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
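The cycle above probes each expected control-plane component with `crictl ps -a --quiet --name=<component>` and reports the ones with no matching container. A minimal stand-alone sketch of that probe loop follows; `list_ids` is a hypothetical stub standing in for the real `sudo crictl ps -a --quiet --name=$1` call, wired to return nothing so the loop reproduces the empty results seen in this log:

```shell
probe_all() {
  # Stub for "sudo crictl ps -a --quiet --name=$1"; always empty here,
  # mimicking the log above where no control-plane container ever started.
  list_ids() { echo ""; }

  for name in kube-apiserver etcd coredns kube-scheduler \
              kube-proxy kube-controller-manager kindnet; do
    ids=$(list_ids "$name")
    # An empty ID list means the component has no container, running or exited.
    [ -z "$ids" ] && echo "No container was found matching \"$name\""
  done
  return 0
}

probe_all
```

Swapping the stub for the real crictl call (on a node with a CRI runtime) turns this into the same check minikube's log gatherer performs.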
	I1202 19:15:50.534902   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:15:50.544843   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:15:50.544904   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:15:50.573435   54807 cri.go:89] found id: ""
	I1202 19:15:50.573449   54807 logs.go:282] 0 containers: []
	W1202 19:15:50.573456   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:15:50.573462   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:15:50.573524   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:15:50.598029   54807 cri.go:89] found id: ""
	I1202 19:15:50.598043   54807 logs.go:282] 0 containers: []
	W1202 19:15:50.598051   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:15:50.598056   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:15:50.598115   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:15:50.623452   54807 cri.go:89] found id: ""
	I1202 19:15:50.623465   54807 logs.go:282] 0 containers: []
	W1202 19:15:50.623472   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:15:50.623478   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:15:50.623536   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:15:50.648357   54807 cri.go:89] found id: ""
	I1202 19:15:50.648371   54807 logs.go:282] 0 containers: []
	W1202 19:15:50.648378   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:15:50.648383   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:15:50.648441   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:15:50.672042   54807 cri.go:89] found id: ""
	I1202 19:15:50.672056   54807 logs.go:282] 0 containers: []
	W1202 19:15:50.672063   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:15:50.672068   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:15:50.672125   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:15:50.697434   54807 cri.go:89] found id: ""
	I1202 19:15:50.697448   54807 logs.go:282] 0 containers: []
	W1202 19:15:50.697455   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:15:50.697461   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:15:50.697525   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:15:50.728291   54807 cri.go:89] found id: ""
	I1202 19:15:50.728305   54807 logs.go:282] 0 containers: []
	W1202 19:15:50.728312   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:15:50.728340   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:15:50.728351   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:15:50.790193   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:15:50.781764   13524 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:50.782494   13524 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:50.784122   13524 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:50.784535   13524 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:50.786186   13524 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:15:50.781764   13524 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:50.782494   13524 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:50.784122   13524 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:50.784535   13524 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:50.786186   13524 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:15:50.790203   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:15:50.790214   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:15:50.855933   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:15:50.855951   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:15:50.884682   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:15:50.884698   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:15:50.949404   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:15:50.949423   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
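Every `kubectl describe nodes` attempt above fails with `dial tcp [::1]:8441: connect: connection refused`, i.e. nothing is listening on the apiserver port at all (consistent with the empty `kube-apiserver` container list). A quick bash-only way to confirm that directly, before blaming kubectl or the kubeconfig, is to probe the port with bash's `/dev/tcp` redirection. This is an illustrative sketch, not part of minikube; the host and port default to the values seen in this log:

```shell
check_apiserver() {
  # Probe host:port with bash's /dev/tcp pseudo-device. "connection refused"
  # here means no listener, which rules out TLS/auth issues as the cause.
  host=${1:-localhost}; port=${2:-8441}
  if (exec 3<>"/dev/tcp/$host/$port") 2>/dev/null; then
    echo "listening"
  else
    echo "connection refused"
  fi
}
```

In the failure above, `check_apiserver localhost 8441` would print `connection refused`, matching kubectl's error and pointing the investigation at why the apiserver container never started rather than at client configuration.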
	I1202 19:15:53.461440   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:15:53.471831   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:15:53.471906   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:15:53.496591   54807 cri.go:89] found id: ""
	I1202 19:15:53.496604   54807 logs.go:282] 0 containers: []
	W1202 19:15:53.496611   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:15:53.496617   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:15:53.496674   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:15:53.521087   54807 cri.go:89] found id: ""
	I1202 19:15:53.521103   54807 logs.go:282] 0 containers: []
	W1202 19:15:53.521111   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:15:53.521116   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:15:53.521174   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:15:53.545148   54807 cri.go:89] found id: ""
	I1202 19:15:53.545161   54807 logs.go:282] 0 containers: []
	W1202 19:15:53.545168   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:15:53.545173   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:15:53.545231   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:15:53.570884   54807 cri.go:89] found id: ""
	I1202 19:15:53.570898   54807 logs.go:282] 0 containers: []
	W1202 19:15:53.570904   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:15:53.570910   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:15:53.570972   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:15:53.597220   54807 cri.go:89] found id: ""
	I1202 19:15:53.597234   54807 logs.go:282] 0 containers: []
	W1202 19:15:53.597241   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:15:53.597247   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:15:53.597326   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:15:53.626817   54807 cri.go:89] found id: ""
	I1202 19:15:53.626830   54807 logs.go:282] 0 containers: []
	W1202 19:15:53.626837   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:15:53.626843   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:15:53.626901   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:15:53.656721   54807 cri.go:89] found id: ""
	I1202 19:15:53.656734   54807 logs.go:282] 0 containers: []
	W1202 19:15:53.656741   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:15:53.656750   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:15:53.656762   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:15:53.721841   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:15:53.712824   13626 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:53.713681   13626 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:53.715369   13626 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:53.715912   13626 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:53.717503   13626 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:15:53.712824   13626 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:53.713681   13626 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:53.715369   13626 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:53.715912   13626 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:53.717503   13626 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:15:53.721850   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:15:53.721862   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:15:53.785783   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:15:53.785801   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:15:53.815658   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:15:53.815673   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:15:53.873221   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:15:53.873238   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:15:56.384447   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:15:56.394843   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:15:56.394909   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:15:56.425129   54807 cri.go:89] found id: ""
	I1202 19:15:56.425142   54807 logs.go:282] 0 containers: []
	W1202 19:15:56.425149   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:15:56.425154   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:15:56.425212   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:15:56.451236   54807 cri.go:89] found id: ""
	I1202 19:15:56.451250   54807 logs.go:282] 0 containers: []
	W1202 19:15:56.451257   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:15:56.451263   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:15:56.451327   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:15:56.476585   54807 cri.go:89] found id: ""
	I1202 19:15:56.476599   54807 logs.go:282] 0 containers: []
	W1202 19:15:56.476606   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:15:56.476611   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:15:56.476669   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:15:56.501814   54807 cri.go:89] found id: ""
	I1202 19:15:56.501828   54807 logs.go:282] 0 containers: []
	W1202 19:15:56.501834   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:15:56.501840   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:15:56.501900   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:15:56.530866   54807 cri.go:89] found id: ""
	I1202 19:15:56.530879   54807 logs.go:282] 0 containers: []
	W1202 19:15:56.530886   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:15:56.530891   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:15:56.530959   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:15:56.555014   54807 cri.go:89] found id: ""
	I1202 19:15:56.555029   54807 logs.go:282] 0 containers: []
	W1202 19:15:56.555036   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:15:56.555042   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:15:56.555102   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:15:56.582644   54807 cri.go:89] found id: ""
	I1202 19:15:56.582657   54807 logs.go:282] 0 containers: []
	W1202 19:15:56.582664   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:15:56.582672   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:15:56.582684   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:15:56.637937   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:15:56.637955   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:15:56.648656   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:15:56.648672   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:15:56.716929   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:15:56.708385   13741 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:56.708981   13741 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:56.710802   13741 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:56.711377   13741 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:56.712990   13741 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:15:56.708385   13741 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:56.708981   13741 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:56.710802   13741 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:56.711377   13741 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:56.712990   13741 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:15:56.716939   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:15:56.716950   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:15:56.783854   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:15:56.783880   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:15:59.312498   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:15:59.322671   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:15:59.322730   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:15:59.346425   54807 cri.go:89] found id: ""
	I1202 19:15:59.346439   54807 logs.go:282] 0 containers: []
	W1202 19:15:59.346446   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:15:59.346452   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:15:59.346515   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:15:59.371199   54807 cri.go:89] found id: ""
	I1202 19:15:59.371212   54807 logs.go:282] 0 containers: []
	W1202 19:15:59.371219   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:15:59.371224   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:15:59.371286   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:15:59.398444   54807 cri.go:89] found id: ""
	I1202 19:15:59.398458   54807 logs.go:282] 0 containers: []
	W1202 19:15:59.398465   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:15:59.398470   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:15:59.398528   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:15:59.423109   54807 cri.go:89] found id: ""
	I1202 19:15:59.423122   54807 logs.go:282] 0 containers: []
	W1202 19:15:59.423129   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:15:59.423135   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:15:59.423193   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:15:59.448440   54807 cri.go:89] found id: ""
	I1202 19:15:59.448454   54807 logs.go:282] 0 containers: []
	W1202 19:15:59.448461   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:15:59.448469   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:15:59.448539   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:15:59.472288   54807 cri.go:89] found id: ""
	I1202 19:15:59.472302   54807 logs.go:282] 0 containers: []
	W1202 19:15:59.472309   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:15:59.472315   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:15:59.472396   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:15:59.501959   54807 cri.go:89] found id: ""
	I1202 19:15:59.501973   54807 logs.go:282] 0 containers: []
	W1202 19:15:59.501980   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:15:59.501987   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:15:59.501999   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:15:59.562783   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:15:59.555099   13842 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:59.555718   13842 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:59.557259   13842 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:59.557814   13842 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:59.559465   13842 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:15:59.555099   13842 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:59.555718   13842 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:59.557259   13842 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:59.557814   13842 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:15:59.559465   13842 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:15:59.562800   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:15:59.562811   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:15:59.626612   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:15:59.626631   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:15:59.655068   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:15:59.655083   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:15:59.713332   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:15:59.713350   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:16:02.224451   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:16:02.234704   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:16:02.234765   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:16:02.260861   54807 cri.go:89] found id: ""
	I1202 19:16:02.260875   54807 logs.go:282] 0 containers: []
	W1202 19:16:02.260882   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:16:02.260888   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:16:02.260951   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:16:02.286334   54807 cri.go:89] found id: ""
	I1202 19:16:02.286354   54807 logs.go:282] 0 containers: []
	W1202 19:16:02.286362   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:16:02.286367   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:16:02.286426   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:16:02.310961   54807 cri.go:89] found id: ""
	I1202 19:16:02.310975   54807 logs.go:282] 0 containers: []
	W1202 19:16:02.310982   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:16:02.310988   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:16:02.311050   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:16:02.339645   54807 cri.go:89] found id: ""
	I1202 19:16:02.339658   54807 logs.go:282] 0 containers: []
	W1202 19:16:02.339665   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:16:02.339670   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:16:02.339727   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:16:02.364456   54807 cri.go:89] found id: ""
	I1202 19:16:02.364471   54807 logs.go:282] 0 containers: []
	W1202 19:16:02.364478   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:16:02.364484   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:16:02.364547   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:16:02.394258   54807 cri.go:89] found id: ""
	I1202 19:16:02.394272   54807 logs.go:282] 0 containers: []
	W1202 19:16:02.394278   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:16:02.394284   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:16:02.394342   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:16:02.418723   54807 cri.go:89] found id: ""
	I1202 19:16:02.418737   54807 logs.go:282] 0 containers: []
	W1202 19:16:02.418744   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:16:02.418752   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:16:02.418762   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:16:02.482679   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:16:02.474517   13947 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:02.475378   13947 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:02.476933   13947 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:02.477486   13947 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:02.479002   13947 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:16:02.474517   13947 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:02.475378   13947 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:02.476933   13947 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:02.477486   13947 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:02.479002   13947 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:16:02.482690   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:16:02.482700   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:16:02.548276   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:16:02.548295   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:16:02.578369   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:16:02.578386   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:16:02.636563   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:16:02.636581   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:16:05.147857   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:16:05.158273   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:16:05.158332   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:16:05.198133   54807 cri.go:89] found id: ""
	I1202 19:16:05.198149   54807 logs.go:282] 0 containers: []
	W1202 19:16:05.198161   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:16:05.198167   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:16:05.198230   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:16:05.229481   54807 cri.go:89] found id: ""
	I1202 19:16:05.229494   54807 logs.go:282] 0 containers: []
	W1202 19:16:05.229508   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:16:05.229513   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:16:05.229573   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:16:05.255940   54807 cri.go:89] found id: ""
	I1202 19:16:05.255954   54807 logs.go:282] 0 containers: []
	W1202 19:16:05.255961   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:16:05.255967   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:16:05.256027   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:16:05.281978   54807 cri.go:89] found id: ""
	I1202 19:16:05.281991   54807 logs.go:282] 0 containers: []
	W1202 19:16:05.281998   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:16:05.282004   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:16:05.282063   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:16:05.310511   54807 cri.go:89] found id: ""
	I1202 19:16:05.310525   54807 logs.go:282] 0 containers: []
	W1202 19:16:05.310533   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:16:05.310539   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:16:05.310605   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:16:05.340114   54807 cri.go:89] found id: ""
	I1202 19:16:05.340127   54807 logs.go:282] 0 containers: []
	W1202 19:16:05.340135   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:16:05.340140   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:16:05.340198   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:16:05.366243   54807 cri.go:89] found id: ""
	I1202 19:16:05.366256   54807 logs.go:282] 0 containers: []
	W1202 19:16:05.366263   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:16:05.366271   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:16:05.366283   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:16:05.393993   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:16:05.394009   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:16:05.450279   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:16:05.450299   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:16:05.461585   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:16:05.461602   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:16:05.528601   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:16:05.520245   14070 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:05.521019   14070 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:05.522739   14070 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:05.523260   14070 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:05.524914   14070 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:16:05.520245   14070 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:05.521019   14070 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:05.522739   14070 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:05.523260   14070 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:05.524914   14070 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:16:05.528610   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:16:05.528621   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:16:08.097252   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:16:08.107731   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:16:08.107792   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:16:08.134215   54807 cri.go:89] found id: ""
	I1202 19:16:08.134240   54807 logs.go:282] 0 containers: []
	W1202 19:16:08.134248   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:16:08.134255   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:16:08.134327   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:16:08.160174   54807 cri.go:89] found id: ""
	I1202 19:16:08.160188   54807 logs.go:282] 0 containers: []
	W1202 19:16:08.160195   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:16:08.160200   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:16:08.160259   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:16:08.188835   54807 cri.go:89] found id: ""
	I1202 19:16:08.188849   54807 logs.go:282] 0 containers: []
	W1202 19:16:08.188856   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:16:08.188871   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:16:08.188930   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:16:08.222672   54807 cri.go:89] found id: ""
	I1202 19:16:08.222686   54807 logs.go:282] 0 containers: []
	W1202 19:16:08.222703   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:16:08.222710   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:16:08.222774   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:16:08.252685   54807 cri.go:89] found id: ""
	I1202 19:16:08.252699   54807 logs.go:282] 0 containers: []
	W1202 19:16:08.252705   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:16:08.252711   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:16:08.252767   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:16:08.281659   54807 cri.go:89] found id: ""
	I1202 19:16:08.281672   54807 logs.go:282] 0 containers: []
	W1202 19:16:08.281679   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:16:08.281685   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:16:08.281757   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:16:08.306909   54807 cri.go:89] found id: ""
	I1202 19:16:08.306922   54807 logs.go:282] 0 containers: []
	W1202 19:16:08.306929   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:16:08.306936   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:16:08.306947   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:16:08.363919   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:16:08.363938   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:16:08.375138   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:16:08.375154   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:16:08.443392   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:16:08.435786   14165 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:08.436517   14165 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:08.438269   14165 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:08.438769   14165 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:08.439921   14165 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:16:08.435786   14165 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:08.436517   14165 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:08.438269   14165 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:08.438769   14165 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:08.439921   14165 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:16:08.443414   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:16:08.443428   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:16:08.507474   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:16:08.507492   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:16:11.037665   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:16:11.050056   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:16:11.050130   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:16:11.076993   54807 cri.go:89] found id: ""
	I1202 19:16:11.077008   54807 logs.go:282] 0 containers: []
	W1202 19:16:11.077015   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:16:11.077021   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:16:11.077088   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:16:11.104370   54807 cri.go:89] found id: ""
	I1202 19:16:11.104384   54807 logs.go:282] 0 containers: []
	W1202 19:16:11.104393   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:16:11.104399   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:16:11.104463   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:16:11.132145   54807 cri.go:89] found id: ""
	I1202 19:16:11.132160   54807 logs.go:282] 0 containers: []
	W1202 19:16:11.132168   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:16:11.132174   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:16:11.132235   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:16:11.158847   54807 cri.go:89] found id: ""
	I1202 19:16:11.158861   54807 logs.go:282] 0 containers: []
	W1202 19:16:11.158868   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:16:11.158874   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:16:11.158934   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:16:11.198715   54807 cri.go:89] found id: ""
	I1202 19:16:11.198729   54807 logs.go:282] 0 containers: []
	W1202 19:16:11.198736   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:16:11.198741   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:16:11.198804   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:16:11.230867   54807 cri.go:89] found id: ""
	I1202 19:16:11.230886   54807 logs.go:282] 0 containers: []
	W1202 19:16:11.230893   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:16:11.230899   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:16:11.230957   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:16:11.259807   54807 cri.go:89] found id: ""
	I1202 19:16:11.259821   54807 logs.go:282] 0 containers: []
	W1202 19:16:11.259828   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:16:11.259836   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:16:11.259846   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:16:11.287151   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:16:11.287167   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:16:11.344009   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:16:11.344032   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:16:11.354412   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:16:11.354433   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:16:11.420896   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:16:11.412603   14280 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:11.413632   14280 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:11.415146   14280 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:11.415437   14280 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:11.416861   14280 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:16:11.412603   14280 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:11.413632   14280 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:11.415146   14280 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:11.415437   14280 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:11.416861   14280 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:16:11.420906   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:16:11.420917   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:16:13.984421   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:16:13.995238   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:16:13.995302   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:16:14.021325   54807 cri.go:89] found id: ""
	I1202 19:16:14.021338   54807 logs.go:282] 0 containers: []
	W1202 19:16:14.021345   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:16:14.021350   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:16:14.021407   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:16:14.047264   54807 cri.go:89] found id: ""
	I1202 19:16:14.047278   54807 logs.go:282] 0 containers: []
	W1202 19:16:14.047285   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:16:14.047291   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:16:14.047355   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:16:14.071231   54807 cri.go:89] found id: ""
	I1202 19:16:14.071245   54807 logs.go:282] 0 containers: []
	W1202 19:16:14.071252   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:16:14.071257   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:16:14.071315   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:16:14.096289   54807 cri.go:89] found id: ""
	I1202 19:16:14.096302   54807 logs.go:282] 0 containers: []
	W1202 19:16:14.096309   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:16:14.096315   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:16:14.096397   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:16:14.122522   54807 cri.go:89] found id: ""
	I1202 19:16:14.122535   54807 logs.go:282] 0 containers: []
	W1202 19:16:14.122542   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:16:14.122548   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:16:14.122608   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:16:14.151408   54807 cri.go:89] found id: ""
	I1202 19:16:14.151422   54807 logs.go:282] 0 containers: []
	W1202 19:16:14.151429   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:16:14.151435   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:16:14.151496   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:16:14.182327   54807 cri.go:89] found id: ""
	I1202 19:16:14.182340   54807 logs.go:282] 0 containers: []
	W1202 19:16:14.182347   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:16:14.182355   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:16:14.182365   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:16:14.246777   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:16:14.246796   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:16:14.262093   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:16:14.262108   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:16:14.326058   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:16:14.317581   14371 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:14.318458   14371 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:14.320176   14371 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:14.320802   14371 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:14.322292   14371 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:16:14.317581   14371 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:14.318458   14371 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:14.320176   14371 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:14.320802   14371 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:14.322292   14371 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:16:14.326068   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:16:14.326080   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:16:14.388559   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:16:14.388578   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:16:16.920108   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:16:16.930319   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:16:16.930382   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:16:16.955799   54807 cri.go:89] found id: ""
	I1202 19:16:16.955813   54807 logs.go:282] 0 containers: []
	W1202 19:16:16.955820   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:16:16.955825   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:16:16.955882   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:16:16.982139   54807 cri.go:89] found id: ""
	I1202 19:16:16.982153   54807 logs.go:282] 0 containers: []
	W1202 19:16:16.982160   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:16:16.982165   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:16:16.982223   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:16:17.007837   54807 cri.go:89] found id: ""
	I1202 19:16:17.007851   54807 logs.go:282] 0 containers: []
	W1202 19:16:17.007857   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:16:17.007863   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:16:17.007933   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:16:17.034216   54807 cri.go:89] found id: ""
	I1202 19:16:17.034229   54807 logs.go:282] 0 containers: []
	W1202 19:16:17.034236   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:16:17.034241   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:16:17.034298   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:16:17.063913   54807 cri.go:89] found id: ""
	I1202 19:16:17.063927   54807 logs.go:282] 0 containers: []
	W1202 19:16:17.063934   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:16:17.063939   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:16:17.063997   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:16:17.088826   54807 cri.go:89] found id: ""
	I1202 19:16:17.088840   54807 logs.go:282] 0 containers: []
	W1202 19:16:17.088847   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:16:17.088853   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:16:17.088913   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:16:17.114356   54807 cri.go:89] found id: ""
	I1202 19:16:17.114370   54807 logs.go:282] 0 containers: []
	W1202 19:16:17.114376   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:16:17.114384   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:16:17.114394   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:16:17.171571   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:16:17.171591   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:16:17.192662   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:16:17.192677   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:16:17.265860   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:16:17.257716   14475 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:17.258547   14475 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:17.260144   14475 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:17.260824   14475 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:17.262474   14475 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:16:17.257716   14475 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:17.258547   14475 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:17.260144   14475 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:17.260824   14475 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:17.262474   14475 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:16:17.265870   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:16:17.265883   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:16:17.329636   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:16:17.329654   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:16:19.857139   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:16:19.867414   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:16:19.867471   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:16:19.891736   54807 cri.go:89] found id: ""
	I1202 19:16:19.891750   54807 logs.go:282] 0 containers: []
	W1202 19:16:19.891757   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:16:19.891762   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:16:19.891819   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:16:19.916840   54807 cri.go:89] found id: ""
	I1202 19:16:19.916854   54807 logs.go:282] 0 containers: []
	W1202 19:16:19.916861   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:16:19.916881   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:16:19.916938   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:16:19.941623   54807 cri.go:89] found id: ""
	I1202 19:16:19.941636   54807 logs.go:282] 0 containers: []
	W1202 19:16:19.941643   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:16:19.941649   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:16:19.941706   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:16:19.973037   54807 cri.go:89] found id: ""
	I1202 19:16:19.973051   54807 logs.go:282] 0 containers: []
	W1202 19:16:19.973059   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:16:19.973065   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:16:19.973134   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:16:20.000748   54807 cri.go:89] found id: ""
	I1202 19:16:20.000765   54807 logs.go:282] 0 containers: []
	W1202 19:16:20.000773   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:16:20.000780   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:16:20.000851   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:16:20.025854   54807 cri.go:89] found id: ""
	I1202 19:16:20.025868   54807 logs.go:282] 0 containers: []
	W1202 19:16:20.025875   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:16:20.025881   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:16:20.025940   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:16:20.052281   54807 cri.go:89] found id: ""
	I1202 19:16:20.052296   54807 logs.go:282] 0 containers: []
	W1202 19:16:20.052304   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:16:20.052312   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:16:20.052346   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:16:20.120511   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:16:20.111945   14568 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:20.112719   14568 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:20.114519   14568 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:20.115208   14568 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:20.116979   14568 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:16:20.111945   14568 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:20.112719   14568 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:20.114519   14568 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:20.115208   14568 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:20.116979   14568 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:16:20.120542   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:16:20.120557   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:16:20.192068   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:16:20.192088   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:16:20.232059   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:16:20.232074   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:16:20.287505   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:16:20.287527   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:16:22.798885   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:16:22.808880   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:16:22.808947   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:16:22.838711   54807 cri.go:89] found id: ""
	I1202 19:16:22.838736   54807 logs.go:282] 0 containers: []
	W1202 19:16:22.838744   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:16:22.838750   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:16:22.838815   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:16:22.866166   54807 cri.go:89] found id: ""
	I1202 19:16:22.866180   54807 logs.go:282] 0 containers: []
	W1202 19:16:22.866187   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:16:22.866192   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:16:22.866250   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:16:22.890456   54807 cri.go:89] found id: ""
	I1202 19:16:22.890470   54807 logs.go:282] 0 containers: []
	W1202 19:16:22.890484   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:16:22.890490   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:16:22.890554   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:16:22.915548   54807 cri.go:89] found id: ""
	I1202 19:16:22.915562   54807 logs.go:282] 0 containers: []
	W1202 19:16:22.915578   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:16:22.915585   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:16:22.915643   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:16:22.940011   54807 cri.go:89] found id: ""
	I1202 19:16:22.940025   54807 logs.go:282] 0 containers: []
	W1202 19:16:22.940032   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:16:22.940037   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:16:22.940093   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:16:22.965647   54807 cri.go:89] found id: ""
	I1202 19:16:22.965660   54807 logs.go:282] 0 containers: []
	W1202 19:16:22.965670   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:16:22.965677   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:16:22.965744   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:16:22.994566   54807 cri.go:89] found id: ""
	I1202 19:16:22.994580   54807 logs.go:282] 0 containers: []
	W1202 19:16:22.994587   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:16:22.994595   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:16:22.994611   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:16:23.050953   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:16:23.050973   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:16:23.061610   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:16:23.061624   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:16:23.127525   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:16:23.119520   14680 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:23.120161   14680 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:23.121888   14680 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:23.122530   14680 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:23.124179   14680 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:16:23.119520   14680 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:23.120161   14680 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:23.121888   14680 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:23.122530   14680 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:23.124179   14680 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:16:23.127534   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:16:23.127546   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:16:23.194603   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:16:23.194639   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:16:25.725656   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:16:25.735521   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:16:25.735580   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:16:25.760626   54807 cri.go:89] found id: ""
	I1202 19:16:25.760640   54807 logs.go:282] 0 containers: []
	W1202 19:16:25.760647   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:16:25.760652   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:16:25.760711   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:16:25.786443   54807 cri.go:89] found id: ""
	I1202 19:16:25.786457   54807 logs.go:282] 0 containers: []
	W1202 19:16:25.786464   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:16:25.786470   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:16:25.786529   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:16:25.813975   54807 cri.go:89] found id: ""
	I1202 19:16:25.813989   54807 logs.go:282] 0 containers: []
	W1202 19:16:25.813996   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:16:25.814001   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:16:25.814059   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:16:25.839899   54807 cri.go:89] found id: ""
	I1202 19:16:25.839912   54807 logs.go:282] 0 containers: []
	W1202 19:16:25.839920   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:16:25.839925   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:16:25.839983   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:16:25.869299   54807 cri.go:89] found id: ""
	I1202 19:16:25.869312   54807 logs.go:282] 0 containers: []
	W1202 19:16:25.869319   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:16:25.869325   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:16:25.869384   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:16:25.894364   54807 cri.go:89] found id: ""
	I1202 19:16:25.894379   54807 logs.go:282] 0 containers: []
	W1202 19:16:25.894385   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:16:25.894391   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:16:25.894448   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:16:25.919717   54807 cri.go:89] found id: ""
	I1202 19:16:25.919733   54807 logs.go:282] 0 containers: []
	W1202 19:16:25.919741   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:16:25.919748   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:16:25.919759   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:16:25.988177   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:16:25.979290   14781 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:25.979818   14781 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:25.981491   14781 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:25.982030   14781 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:25.983829   14781 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1202 19:16:25.988188   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:16:25.988198   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:16:26.052787   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:16:26.052806   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:16:26.081027   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:16:26.081042   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:16:26.138061   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:16:26.138079   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:16:28.650000   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:16:28.660481   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:16:28.660541   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:16:28.685594   54807 cri.go:89] found id: ""
	I1202 19:16:28.685608   54807 logs.go:282] 0 containers: []
	W1202 19:16:28.685616   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:16:28.685621   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:16:28.685679   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:16:28.710399   54807 cri.go:89] found id: ""
	I1202 19:16:28.710412   54807 logs.go:282] 0 containers: []
	W1202 19:16:28.710419   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:16:28.710425   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:16:28.710481   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:16:28.735520   54807 cri.go:89] found id: ""
	I1202 19:16:28.735533   54807 logs.go:282] 0 containers: []
	W1202 19:16:28.735546   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:16:28.735551   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:16:28.735607   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:16:28.762423   54807 cri.go:89] found id: ""
	I1202 19:16:28.762436   54807 logs.go:282] 0 containers: []
	W1202 19:16:28.762443   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:16:28.762449   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:16:28.762515   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:16:28.791746   54807 cri.go:89] found id: ""
	I1202 19:16:28.791760   54807 logs.go:282] 0 containers: []
	W1202 19:16:28.791767   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:16:28.791772   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:16:28.791831   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:16:28.818359   54807 cri.go:89] found id: ""
	I1202 19:16:28.818372   54807 logs.go:282] 0 containers: []
	W1202 19:16:28.818379   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:16:28.818386   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:16:28.818443   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:16:28.846465   54807 cri.go:89] found id: ""
	I1202 19:16:28.846479   54807 logs.go:282] 0 containers: []
	W1202 19:16:28.846486   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:16:28.846494   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:16:28.846503   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:16:28.903412   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:16:28.903430   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:16:28.914210   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:16:28.914267   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:16:28.978428   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:16:28.970543   14894 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:28.971044   14894 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:28.972803   14894 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:28.973286   14894 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:28.974839   14894 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1202 19:16:28.978439   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:16:28.978450   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:16:29.041343   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:16:29.041363   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:16:31.570595   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:16:31.583500   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:16:31.583573   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:16:31.611783   54807 cri.go:89] found id: ""
	I1202 19:16:31.611796   54807 logs.go:282] 0 containers: []
	W1202 19:16:31.611805   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:16:31.611811   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:16:31.611868   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:16:31.639061   54807 cri.go:89] found id: ""
	I1202 19:16:31.639074   54807 logs.go:282] 0 containers: []
	W1202 19:16:31.639081   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:16:31.639086   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:16:31.639152   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:16:31.664706   54807 cri.go:89] found id: ""
	I1202 19:16:31.664719   54807 logs.go:282] 0 containers: []
	W1202 19:16:31.664726   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:16:31.664732   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:16:31.664789   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:16:31.688725   54807 cri.go:89] found id: ""
	I1202 19:16:31.688739   54807 logs.go:282] 0 containers: []
	W1202 19:16:31.688746   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:16:31.688751   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:16:31.688807   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:16:31.713308   54807 cri.go:89] found id: ""
	I1202 19:16:31.713321   54807 logs.go:282] 0 containers: []
	W1202 19:16:31.713328   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:16:31.713333   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:16:31.713391   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:16:31.737960   54807 cri.go:89] found id: ""
	I1202 19:16:31.737973   54807 logs.go:282] 0 containers: []
	W1202 19:16:31.737980   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:16:31.737985   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:16:31.738041   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:16:31.766035   54807 cri.go:89] found id: ""
	I1202 19:16:31.766048   54807 logs.go:282] 0 containers: []
	W1202 19:16:31.766055   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:16:31.766063   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:16:31.766078   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:16:31.821307   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:16:31.821327   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:16:31.832103   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:16:31.832118   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:16:31.894804   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:16:31.886409   14997 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:31.887232   14997 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:31.888852   14997 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:31.889449   14997 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:31.891143   14997 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1202 19:16:31.894814   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:16:31.894824   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:16:31.958623   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:16:31.958641   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:16:34.494532   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:16:34.504804   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:16:34.504861   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:16:34.534339   54807 cri.go:89] found id: ""
	I1202 19:16:34.534359   54807 logs.go:282] 0 containers: []
	W1202 19:16:34.534366   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:16:34.534372   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:16:34.534430   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:16:34.559181   54807 cri.go:89] found id: ""
	I1202 19:16:34.559194   54807 logs.go:282] 0 containers: []
	W1202 19:16:34.559203   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:16:34.559208   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:16:34.559266   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:16:34.583120   54807 cri.go:89] found id: ""
	I1202 19:16:34.583133   54807 logs.go:282] 0 containers: []
	W1202 19:16:34.583139   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:16:34.583145   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:16:34.583245   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:16:34.608256   54807 cri.go:89] found id: ""
	I1202 19:16:34.608269   54807 logs.go:282] 0 containers: []
	W1202 19:16:34.608276   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:16:34.608282   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:16:34.608365   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:16:34.632733   54807 cri.go:89] found id: ""
	I1202 19:16:34.632747   54807 logs.go:282] 0 containers: []
	W1202 19:16:34.632754   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:16:34.632759   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:16:34.632821   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:16:34.663293   54807 cri.go:89] found id: ""
	I1202 19:16:34.663307   54807 logs.go:282] 0 containers: []
	W1202 19:16:34.663314   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:16:34.663320   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:16:34.663376   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:16:34.686842   54807 cri.go:89] found id: ""
	I1202 19:16:34.686856   54807 logs.go:282] 0 containers: []
	W1202 19:16:34.686863   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:16:34.686871   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:16:34.686881   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:16:34.697549   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:16:34.697564   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:16:34.764406   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:16:34.756417   15099 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:34.757141   15099 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:34.758783   15099 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:34.759285   15099 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:34.760837   15099 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1202 19:16:34.764416   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:16:34.764427   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:16:34.827201   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:16:34.827223   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:16:34.854552   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:16:34.854570   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:16:37.413003   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:16:37.423382   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:16:37.423441   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:16:37.459973   54807 cri.go:89] found id: ""
	I1202 19:16:37.459987   54807 logs.go:282] 0 containers: []
	W1202 19:16:37.459994   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:16:37.460000   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:16:37.460062   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:16:37.494488   54807 cri.go:89] found id: ""
	I1202 19:16:37.494503   54807 logs.go:282] 0 containers: []
	W1202 19:16:37.494510   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:16:37.494515   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:16:37.494584   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:16:37.519270   54807 cri.go:89] found id: ""
	I1202 19:16:37.519283   54807 logs.go:282] 0 containers: []
	W1202 19:16:37.519290   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:16:37.519295   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:16:37.519351   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:16:37.545987   54807 cri.go:89] found id: ""
	I1202 19:16:37.546001   54807 logs.go:282] 0 containers: []
	W1202 19:16:37.546008   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:16:37.546013   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:16:37.546069   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:16:37.574348   54807 cri.go:89] found id: ""
	I1202 19:16:37.574362   54807 logs.go:282] 0 containers: []
	W1202 19:16:37.574369   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:16:37.574375   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:16:37.574437   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:16:37.600075   54807 cri.go:89] found id: ""
	I1202 19:16:37.600089   54807 logs.go:282] 0 containers: []
	W1202 19:16:37.600096   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:16:37.600102   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:16:37.600167   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:16:37.625421   54807 cri.go:89] found id: ""
	I1202 19:16:37.625434   54807 logs.go:282] 0 containers: []
	W1202 19:16:37.625443   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:16:37.625450   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:16:37.625460   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:16:37.688980   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:16:37.689000   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:16:37.719329   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:16:37.719344   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:16:37.778206   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:16:37.778225   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:16:37.789133   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:16:37.789148   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:16:37.856498   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:16:37.848835   15218 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:37.849672   15218 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:37.851302   15218 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:37.851842   15218 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:37.853113   15218 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1202 19:16:40.358183   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:16:40.368449   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:16:40.368509   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:16:40.392705   54807 cri.go:89] found id: ""
	I1202 19:16:40.392721   54807 logs.go:282] 0 containers: []
	W1202 19:16:40.392728   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:16:40.392734   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:16:40.392796   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:16:40.417408   54807 cri.go:89] found id: ""
	I1202 19:16:40.417422   54807 logs.go:282] 0 containers: []
	W1202 19:16:40.417429   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:16:40.417435   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:16:40.417493   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:16:40.458012   54807 cri.go:89] found id: ""
	I1202 19:16:40.458026   54807 logs.go:282] 0 containers: []
	W1202 19:16:40.458033   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:16:40.458039   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:16:40.458094   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:16:40.498315   54807 cri.go:89] found id: ""
	I1202 19:16:40.498328   54807 logs.go:282] 0 containers: []
	W1202 19:16:40.498335   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:16:40.498341   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:16:40.498402   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:16:40.523770   54807 cri.go:89] found id: ""
	I1202 19:16:40.523784   54807 logs.go:282] 0 containers: []
	W1202 19:16:40.523792   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:16:40.523797   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:16:40.523865   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:16:40.549124   54807 cri.go:89] found id: ""
	I1202 19:16:40.549137   54807 logs.go:282] 0 containers: []
	W1202 19:16:40.549144   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:16:40.549149   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:16:40.549207   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:16:40.573667   54807 cri.go:89] found id: ""
	I1202 19:16:40.573680   54807 logs.go:282] 0 containers: []
	W1202 19:16:40.573688   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:16:40.573696   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:16:40.573708   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:16:40.629671   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:16:40.629688   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:16:40.640745   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:16:40.640760   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:16:40.706165   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:16:40.697573   15312 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:40.698405   15312 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:40.700020   15312 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:40.700368   15312 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:40.702012   15312 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:16:40.697573   15312 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:40.698405   15312 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:40.700020   15312 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:40.700368   15312 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:40.702012   15312 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:16:40.706175   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:16:40.706186   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:16:40.775737   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:16:40.775755   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:16:43.307135   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:16:43.317487   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:16:43.317553   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:16:43.342709   54807 cri.go:89] found id: ""
	I1202 19:16:43.342722   54807 logs.go:282] 0 containers: []
	W1202 19:16:43.342730   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:16:43.342735   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:16:43.342793   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:16:43.367380   54807 cri.go:89] found id: ""
	I1202 19:16:43.367393   54807 logs.go:282] 0 containers: []
	W1202 19:16:43.367400   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:16:43.367406   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:16:43.367462   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:16:43.394678   54807 cri.go:89] found id: ""
	I1202 19:16:43.394691   54807 logs.go:282] 0 containers: []
	W1202 19:16:43.394699   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:16:43.394704   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:16:43.394761   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:16:43.421130   54807 cri.go:89] found id: ""
	I1202 19:16:43.421144   54807 logs.go:282] 0 containers: []
	W1202 19:16:43.421151   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:16:43.421156   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:16:43.421212   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:16:43.454728   54807 cri.go:89] found id: ""
	I1202 19:16:43.454741   54807 logs.go:282] 0 containers: []
	W1202 19:16:43.454749   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:16:43.454754   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:16:43.454810   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:16:43.491457   54807 cri.go:89] found id: ""
	I1202 19:16:43.491470   54807 logs.go:282] 0 containers: []
	W1202 19:16:43.491477   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:16:43.491482   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:16:43.491537   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:16:43.515943   54807 cri.go:89] found id: ""
	I1202 19:16:43.515957   54807 logs.go:282] 0 containers: []
	W1202 19:16:43.515964   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:16:43.515972   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:16:43.515982   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:16:43.579953   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:16:43.579972   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:16:43.608617   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:16:43.608632   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:16:43.666586   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:16:43.666604   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:16:43.677358   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:16:43.677374   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:16:43.741646   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:16:43.733470   15433 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:43.734235   15433 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:43.735923   15433 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:43.736656   15433 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:43.738348   15433 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:16:43.733470   15433 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:43.734235   15433 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:43.735923   15433 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:43.736656   15433 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:43.738348   15433 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:16:46.243365   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:16:46.255599   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:16:46.255658   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:16:46.280357   54807 cri.go:89] found id: ""
	I1202 19:16:46.280369   54807 logs.go:282] 0 containers: []
	W1202 19:16:46.280376   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:16:46.280382   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:16:46.280444   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:16:46.304610   54807 cri.go:89] found id: ""
	I1202 19:16:46.304623   54807 logs.go:282] 0 containers: []
	W1202 19:16:46.304630   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:16:46.304635   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:16:46.304692   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:16:46.328944   54807 cri.go:89] found id: ""
	I1202 19:16:46.328957   54807 logs.go:282] 0 containers: []
	W1202 19:16:46.328963   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:16:46.328968   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:16:46.329027   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:16:46.357896   54807 cri.go:89] found id: ""
	I1202 19:16:46.357909   54807 logs.go:282] 0 containers: []
	W1202 19:16:46.357916   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:16:46.357923   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:16:46.357981   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:16:46.381601   54807 cri.go:89] found id: ""
	I1202 19:16:46.381613   54807 logs.go:282] 0 containers: []
	W1202 19:16:46.381620   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:16:46.381626   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:16:46.381687   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:16:46.406928   54807 cri.go:89] found id: ""
	I1202 19:16:46.406942   54807 logs.go:282] 0 containers: []
	W1202 19:16:46.406949   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:16:46.406954   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:16:46.407009   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:16:46.449373   54807 cri.go:89] found id: ""
	I1202 19:16:46.449386   54807 logs.go:282] 0 containers: []
	W1202 19:16:46.449393   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:16:46.449401   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:16:46.449411   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:16:46.516162   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:16:46.516180   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:16:46.527166   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:16:46.527183   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:16:46.590201   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:16:46.582136   15525 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:46.583309   15525 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:46.584090   15525 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:46.585115   15525 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:46.585711   15525 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:16:46.582136   15525 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:46.583309   15525 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:46.584090   15525 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:46.585115   15525 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:46.585711   15525 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:16:46.590211   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:16:46.590221   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:16:46.652574   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:16:46.652593   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:16:49.180131   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:16:49.190665   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:16:49.190729   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:16:49.215295   54807 cri.go:89] found id: ""
	I1202 19:16:49.215308   54807 logs.go:282] 0 containers: []
	W1202 19:16:49.215315   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:16:49.215321   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:16:49.215382   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:16:49.241898   54807 cri.go:89] found id: ""
	I1202 19:16:49.241912   54807 logs.go:282] 0 containers: []
	W1202 19:16:49.241919   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:16:49.241925   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:16:49.241986   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:16:49.266638   54807 cri.go:89] found id: ""
	I1202 19:16:49.266651   54807 logs.go:282] 0 containers: []
	W1202 19:16:49.266658   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:16:49.266664   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:16:49.266719   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:16:49.292478   54807 cri.go:89] found id: ""
	I1202 19:16:49.292496   54807 logs.go:282] 0 containers: []
	W1202 19:16:49.292506   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:16:49.292512   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:16:49.292589   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:16:49.318280   54807 cri.go:89] found id: ""
	I1202 19:16:49.318293   54807 logs.go:282] 0 containers: []
	W1202 19:16:49.318300   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:16:49.318306   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:16:49.318373   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:16:49.351760   54807 cri.go:89] found id: ""
	I1202 19:16:49.351774   54807 logs.go:282] 0 containers: []
	W1202 19:16:49.351787   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:16:49.351793   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:16:49.351854   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:16:49.376513   54807 cri.go:89] found id: ""
	I1202 19:16:49.376536   54807 logs.go:282] 0 containers: []
	W1202 19:16:49.376543   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:16:49.376551   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:16:49.376563   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:16:49.448960   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:16:49.448987   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:16:49.482655   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:16:49.482673   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:16:49.541305   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:16:49.541322   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:16:49.552971   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:16:49.552988   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:16:49.618105   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:16:49.609636   15641 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:49.610374   15641 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:49.612148   15641 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:49.612903   15641 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:49.614517   15641 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:16:49.609636   15641 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:49.610374   15641 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:49.612148   15641 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:49.612903   15641 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:49.614517   15641 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:16:52.119791   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:16:52.130607   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:16:52.130669   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:16:52.155643   54807 cri.go:89] found id: ""
	I1202 19:16:52.155656   54807 logs.go:282] 0 containers: []
	W1202 19:16:52.155663   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:16:52.155669   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:16:52.155729   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:16:52.179230   54807 cri.go:89] found id: ""
	I1202 19:16:52.179244   54807 logs.go:282] 0 containers: []
	W1202 19:16:52.179253   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:16:52.179259   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:16:52.179316   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:16:52.203772   54807 cri.go:89] found id: ""
	I1202 19:16:52.203785   54807 logs.go:282] 0 containers: []
	W1202 19:16:52.203792   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:16:52.203798   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:16:52.203852   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:16:52.236168   54807 cri.go:89] found id: ""
	I1202 19:16:52.236183   54807 logs.go:282] 0 containers: []
	W1202 19:16:52.236190   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:16:52.236196   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:16:52.236257   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:16:52.260979   54807 cri.go:89] found id: ""
	I1202 19:16:52.260995   54807 logs.go:282] 0 containers: []
	W1202 19:16:52.261003   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:16:52.261008   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:16:52.261063   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:16:52.284287   54807 cri.go:89] found id: ""
	I1202 19:16:52.284299   54807 logs.go:282] 0 containers: []
	W1202 19:16:52.284306   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:16:52.284312   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:16:52.284385   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:16:52.310376   54807 cri.go:89] found id: ""
	I1202 19:16:52.310390   54807 logs.go:282] 0 containers: []
	W1202 19:16:52.310397   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:16:52.310405   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:16:52.310415   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:16:52.366619   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:16:52.366636   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:16:52.377556   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:16:52.377572   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:16:52.453208   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:16:52.444029   15727 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:52.444937   15727 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:52.446866   15727 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:52.447608   15727 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:52.449367   15727 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:16:52.444029   15727 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:52.444937   15727 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:52.446866   15727 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:52.447608   15727 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:52.449367   15727 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:16:52.453218   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:16:52.453229   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:16:52.524196   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:16:52.524214   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:16:55.052717   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:16:55.063878   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:16:55.063943   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:16:55.089568   54807 cri.go:89] found id: ""
	I1202 19:16:55.089582   54807 logs.go:282] 0 containers: []
	W1202 19:16:55.089588   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:16:55.089594   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:16:55.089658   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:16:55.116741   54807 cri.go:89] found id: ""
	I1202 19:16:55.116755   54807 logs.go:282] 0 containers: []
	W1202 19:16:55.116762   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:16:55.116768   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:16:55.116825   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:16:55.142748   54807 cri.go:89] found id: ""
	I1202 19:16:55.142761   54807 logs.go:282] 0 containers: []
	W1202 19:16:55.142768   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:16:55.142774   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:16:55.142836   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:16:55.167341   54807 cri.go:89] found id: ""
	I1202 19:16:55.167354   54807 logs.go:282] 0 containers: []
	W1202 19:16:55.167361   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:16:55.167367   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:16:55.167424   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:16:55.194118   54807 cri.go:89] found id: ""
	I1202 19:16:55.194132   54807 logs.go:282] 0 containers: []
	W1202 19:16:55.194139   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:16:55.194144   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:16:55.194201   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:16:55.218379   54807 cri.go:89] found id: ""
	I1202 19:16:55.218393   54807 logs.go:282] 0 containers: []
	W1202 19:16:55.218400   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:16:55.218406   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:16:55.218465   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:16:55.243035   54807 cri.go:89] found id: ""
	I1202 19:16:55.243048   54807 logs.go:282] 0 containers: []
	W1202 19:16:55.243055   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:16:55.243063   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:16:55.243073   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:16:55.310493   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:16:55.302627   15827 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:55.303246   15827 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:55.304790   15827 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:55.305303   15827 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:55.306777   15827 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:16:55.302627   15827 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:55.303246   15827 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:55.304790   15827 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:55.305303   15827 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:55.306777   15827 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:16:55.310504   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:16:55.310517   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:16:55.373914   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:16:55.373933   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:16:55.405157   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:16:55.405172   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:16:55.473565   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:16:55.473583   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:16:57.986363   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:16:57.996902   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:16:57.996969   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:16:58.023028   54807 cri.go:89] found id: ""
	I1202 19:16:58.023042   54807 logs.go:282] 0 containers: []
	W1202 19:16:58.023049   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:16:58.023055   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:16:58.023113   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:16:58.049927   54807 cri.go:89] found id: ""
	I1202 19:16:58.049941   54807 logs.go:282] 0 containers: []
	W1202 19:16:58.049947   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:16:58.049953   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:16:58.050013   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:16:58.078428   54807 cri.go:89] found id: ""
	I1202 19:16:58.078448   54807 logs.go:282] 0 containers: []
	W1202 19:16:58.078456   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:16:58.078461   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:16:58.078516   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:16:58.105365   54807 cri.go:89] found id: ""
	I1202 19:16:58.105377   54807 logs.go:282] 0 containers: []
	W1202 19:16:58.105385   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:16:58.105390   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:16:58.105448   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:16:58.129444   54807 cri.go:89] found id: ""
	I1202 19:16:58.129458   54807 logs.go:282] 0 containers: []
	W1202 19:16:58.129465   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:16:58.129470   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:16:58.129531   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:16:58.157574   54807 cri.go:89] found id: ""
	I1202 19:16:58.157588   54807 logs.go:282] 0 containers: []
	W1202 19:16:58.157594   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:16:58.157607   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:16:58.157670   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:16:58.182028   54807 cri.go:89] found id: ""
	I1202 19:16:58.182042   54807 logs.go:282] 0 containers: []
	W1202 19:16:58.182049   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:16:58.182057   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:16:58.182067   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:16:58.241166   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:16:58.241184   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:16:58.252367   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:16:58.252383   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:16:58.319914   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:16:58.311929   15940 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:58.312824   15940 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:58.314498   15940 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:58.315073   15940 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:58.316444   15940 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:16:58.311929   15940 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:58.312824   15940 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:58.314498   15940 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:58.315073   15940 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:16:58.316444   15940 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:16:58.319925   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:16:58.319937   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:16:58.381228   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:16:58.381246   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:17:00.909644   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:17:00.920924   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:17:00.921037   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:17:00.947793   54807 cri.go:89] found id: ""
	I1202 19:17:00.947812   54807 logs.go:282] 0 containers: []
	W1202 19:17:00.947820   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:17:00.947828   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:17:00.947900   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:17:00.975539   54807 cri.go:89] found id: ""
	I1202 19:17:00.975553   54807 logs.go:282] 0 containers: []
	W1202 19:17:00.975561   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:17:00.975566   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:17:00.975629   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:17:01.002532   54807 cri.go:89] found id: ""
	I1202 19:17:01.002549   54807 logs.go:282] 0 containers: []
	W1202 19:17:01.002560   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:17:01.002566   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:17:01.002636   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:17:01.032211   54807 cri.go:89] found id: ""
	I1202 19:17:01.032226   54807 logs.go:282] 0 containers: []
	W1202 19:17:01.032233   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:17:01.032239   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:17:01.032302   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:17:01.059398   54807 cri.go:89] found id: ""
	I1202 19:17:01.059413   54807 logs.go:282] 0 containers: []
	W1202 19:17:01.059420   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:17:01.059426   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:17:01.059486   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:17:01.091722   54807 cri.go:89] found id: ""
	I1202 19:17:01.091740   54807 logs.go:282] 0 containers: []
	W1202 19:17:01.091746   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:17:01.091752   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:17:01.091816   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:17:01.117849   54807 cri.go:89] found id: ""
	I1202 19:17:01.117864   54807 logs.go:282] 0 containers: []
	W1202 19:17:01.117871   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:17:01.117879   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:17:01.117893   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:17:01.191972   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:17:01.182202   16040 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:01.183119   16040 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:01.185191   16040 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:01.186030   16040 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:01.187874   16040 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:17:01.182202   16040 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:01.183119   16040 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:01.185191   16040 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:01.186030   16040 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:01.187874   16040 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:17:01.191984   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:17:01.191997   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:17:01.260783   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:17:01.260806   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:17:01.290665   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:17:01.290683   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:17:01.348633   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:17:01.348653   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:17:03.860845   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:17:03.871899   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:17:03.871966   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:17:03.899158   54807 cri.go:89] found id: ""
	I1202 19:17:03.899172   54807 logs.go:282] 0 containers: []
	W1202 19:17:03.899179   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:17:03.899185   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:17:03.899244   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:17:03.925147   54807 cri.go:89] found id: ""
	I1202 19:17:03.925161   54807 logs.go:282] 0 containers: []
	W1202 19:17:03.925168   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:17:03.925174   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:17:03.925235   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:17:03.955130   54807 cri.go:89] found id: ""
	I1202 19:17:03.955143   54807 logs.go:282] 0 containers: []
	W1202 19:17:03.955150   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:17:03.955156   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:17:03.955215   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:17:03.983272   54807 cri.go:89] found id: ""
	I1202 19:17:03.983286   54807 logs.go:282] 0 containers: []
	W1202 19:17:03.983294   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:17:03.983300   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:17:03.983371   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:17:04.009435   54807 cri.go:89] found id: ""
	I1202 19:17:04.009449   54807 logs.go:282] 0 containers: []
	W1202 19:17:04.009456   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:17:04.009463   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:17:04.009523   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:17:04.037346   54807 cri.go:89] found id: ""
	I1202 19:17:04.037360   54807 logs.go:282] 0 containers: []
	W1202 19:17:04.037368   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:17:04.037374   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:17:04.037433   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:17:04.066662   54807 cri.go:89] found id: ""
	I1202 19:17:04.066675   54807 logs.go:282] 0 containers: []
	W1202 19:17:04.066682   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:17:04.066690   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:17:04.066701   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:17:04.125350   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:17:04.125369   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:17:04.136698   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:17:04.136716   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:17:04.206327   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:17:04.198119   16153 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:04.198761   16153 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:04.200411   16153 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:04.201024   16153 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:04.202642   16153 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:17:04.198119   16153 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:04.198761   16153 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:04.200411   16153 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:04.201024   16153 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:04.202642   16153 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:17:04.206338   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:17:04.206353   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:17:04.274588   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:17:04.274608   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:17:06.806010   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:17:06.817189   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:17:06.817256   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:17:06.843114   54807 cri.go:89] found id: ""
	I1202 19:17:06.843129   54807 logs.go:282] 0 containers: []
	W1202 19:17:06.843136   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:17:06.843142   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:17:06.843218   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:17:06.873921   54807 cri.go:89] found id: ""
	I1202 19:17:06.873947   54807 logs.go:282] 0 containers: []
	W1202 19:17:06.873955   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:17:06.873961   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:17:06.874045   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:17:06.900636   54807 cri.go:89] found id: ""
	I1202 19:17:06.900651   54807 logs.go:282] 0 containers: []
	W1202 19:17:06.900658   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:17:06.900664   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:17:06.900724   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:17:06.928484   54807 cri.go:89] found id: ""
	I1202 19:17:06.928504   54807 logs.go:282] 0 containers: []
	W1202 19:17:06.928512   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:17:06.928518   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:17:06.928583   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:17:06.956137   54807 cri.go:89] found id: ""
	I1202 19:17:06.956170   54807 logs.go:282] 0 containers: []
	W1202 19:17:06.956179   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:17:06.956184   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:17:06.956258   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:17:06.987383   54807 cri.go:89] found id: ""
	I1202 19:17:06.987408   54807 logs.go:282] 0 containers: []
	W1202 19:17:06.987416   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:17:06.987422   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:17:06.987495   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:17:07.013712   54807 cri.go:89] found id: ""
	I1202 19:17:07.013726   54807 logs.go:282] 0 containers: []
	W1202 19:17:07.013733   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:17:07.013741   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:17:07.013756   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:17:07.076937   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:17:07.076955   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:17:07.106847   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:17:07.106863   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:17:07.164565   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:17:07.164584   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:17:07.177132   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:17:07.177154   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:17:07.245572   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:17:07.237375   16272 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:07.238081   16272 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:07.239663   16272 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:07.240205   16272 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:07.241966   16272 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1202 19:17:09.745822   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:17:09.756122   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:17:09.756180   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:17:09.784649   54807 cri.go:89] found id: ""
	I1202 19:17:09.784663   54807 logs.go:282] 0 containers: []
	W1202 19:17:09.784670   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:17:09.784675   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:17:09.784732   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:17:09.809632   54807 cri.go:89] found id: ""
	I1202 19:17:09.809655   54807 logs.go:282] 0 containers: []
	W1202 19:17:09.809662   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:17:09.809668   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:17:09.809733   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:17:09.839403   54807 cri.go:89] found id: ""
	I1202 19:17:09.839425   54807 logs.go:282] 0 containers: []
	W1202 19:17:09.839433   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:17:09.839439   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:17:09.839504   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:17:09.868977   54807 cri.go:89] found id: ""
	I1202 19:17:09.868991   54807 logs.go:282] 0 containers: []
	W1202 19:17:09.868999   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:17:09.869004   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:17:09.869064   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:17:09.894156   54807 cri.go:89] found id: ""
	I1202 19:17:09.894170   54807 logs.go:282] 0 containers: []
	W1202 19:17:09.894176   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:17:09.894182   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:17:09.894237   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:17:09.919174   54807 cri.go:89] found id: ""
	I1202 19:17:09.919188   54807 logs.go:282] 0 containers: []
	W1202 19:17:09.919195   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:17:09.919200   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:17:09.919261   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:17:09.944620   54807 cri.go:89] found id: ""
	I1202 19:17:09.944632   54807 logs.go:282] 0 containers: []
	W1202 19:17:09.944639   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:17:09.944647   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:17:09.944657   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:17:10.004028   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:17:10.004049   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:17:10.015962   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:17:10.015979   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:17:10.086133   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:17:10.078544   16365 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:10.079196   16365 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:10.080924   16365 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:10.081452   16365 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:10.082613   16365 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1202 19:17:10.086143   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:17:10.086153   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:17:10.148419   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:17:10.148437   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:17:12.676458   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:17:12.687083   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:17:12.687155   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:17:12.712590   54807 cri.go:89] found id: ""
	I1202 19:17:12.712604   54807 logs.go:282] 0 containers: []
	W1202 19:17:12.712611   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:17:12.712616   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:17:12.712674   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:17:12.737565   54807 cri.go:89] found id: ""
	I1202 19:17:12.737578   54807 logs.go:282] 0 containers: []
	W1202 19:17:12.737585   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:17:12.737591   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:17:12.737648   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:17:12.762201   54807 cri.go:89] found id: ""
	I1202 19:17:12.762216   54807 logs.go:282] 0 containers: []
	W1202 19:17:12.762223   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:17:12.762228   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:17:12.762288   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:17:12.786736   54807 cri.go:89] found id: ""
	I1202 19:17:12.786750   54807 logs.go:282] 0 containers: []
	W1202 19:17:12.786758   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:17:12.786763   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:17:12.786825   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:17:12.811994   54807 cri.go:89] found id: ""
	I1202 19:17:12.812008   54807 logs.go:282] 0 containers: []
	W1202 19:17:12.812015   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:17:12.812020   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:17:12.812078   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:17:12.838580   54807 cri.go:89] found id: ""
	I1202 19:17:12.838593   54807 logs.go:282] 0 containers: []
	W1202 19:17:12.838600   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:17:12.838605   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:17:12.838659   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:17:12.863652   54807 cri.go:89] found id: ""
	I1202 19:17:12.863665   54807 logs.go:282] 0 containers: []
	W1202 19:17:12.863672   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:17:12.863679   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:17:12.863689   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:17:12.918766   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:17:12.918784   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:17:12.930406   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:17:12.930428   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:17:13.000633   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:17:12.992135   16467 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:12.992970   16467 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:12.994592   16467 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:12.994977   16467 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:12.996559   16467 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1202 19:17:13.000643   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:17:13.000655   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:17:13.065384   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:17:13.065403   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:17:15.594382   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:17:15.604731   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:17:15.604795   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:17:15.634332   54807 cri.go:89] found id: ""
	I1202 19:17:15.634345   54807 logs.go:282] 0 containers: []
	W1202 19:17:15.634353   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:17:15.634358   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:17:15.634434   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:17:15.663126   54807 cri.go:89] found id: ""
	I1202 19:17:15.663141   54807 logs.go:282] 0 containers: []
	W1202 19:17:15.663148   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:17:15.663153   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:17:15.663217   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:17:15.699033   54807 cri.go:89] found id: ""
	I1202 19:17:15.699051   54807 logs.go:282] 0 containers: []
	W1202 19:17:15.699059   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:17:15.699065   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:17:15.699121   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:17:15.727044   54807 cri.go:89] found id: ""
	I1202 19:17:15.727057   54807 logs.go:282] 0 containers: []
	W1202 19:17:15.727065   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:17:15.727071   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:17:15.727129   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:17:15.754131   54807 cri.go:89] found id: ""
	I1202 19:17:15.754152   54807 logs.go:282] 0 containers: []
	W1202 19:17:15.754159   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:17:15.754165   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:17:15.754224   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:17:15.778325   54807 cri.go:89] found id: ""
	I1202 19:17:15.778338   54807 logs.go:282] 0 containers: []
	W1202 19:17:15.778345   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:17:15.778350   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:17:15.778407   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:17:15.803363   54807 cri.go:89] found id: ""
	I1202 19:17:15.803376   54807 logs.go:282] 0 containers: []
	W1202 19:17:15.803383   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:17:15.803391   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:17:15.803403   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:17:15.814039   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:17:15.814055   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:17:15.885494   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:17:15.877151   16570 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:15.877774   16570 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:15.878766   16570 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:15.880347   16570 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:15.880955   16570 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1202 19:17:15.885505   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:17:15.885516   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:17:15.947276   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:17:15.947295   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:17:15.979963   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:17:15.979981   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:17:18.538313   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:17:18.548423   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:17:18.548490   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:17:18.571700   54807 cri.go:89] found id: ""
	I1202 19:17:18.571714   54807 logs.go:282] 0 containers: []
	W1202 19:17:18.571721   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:17:18.571726   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:17:18.571784   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:17:18.600197   54807 cri.go:89] found id: ""
	I1202 19:17:18.600211   54807 logs.go:282] 0 containers: []
	W1202 19:17:18.600219   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:17:18.600224   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:17:18.600279   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:17:18.628309   54807 cri.go:89] found id: ""
	I1202 19:17:18.628341   54807 logs.go:282] 0 containers: []
	W1202 19:17:18.628348   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:17:18.628353   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:17:18.628440   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:17:18.654241   54807 cri.go:89] found id: ""
	I1202 19:17:18.654255   54807 logs.go:282] 0 containers: []
	W1202 19:17:18.654263   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:17:18.654268   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:17:18.654325   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:17:18.690109   54807 cri.go:89] found id: ""
	I1202 19:17:18.690123   54807 logs.go:282] 0 containers: []
	W1202 19:17:18.690130   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:17:18.690135   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:17:18.690194   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:17:18.719625   54807 cri.go:89] found id: ""
	I1202 19:17:18.719638   54807 logs.go:282] 0 containers: []
	W1202 19:17:18.719646   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:17:18.719651   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:17:18.719713   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:17:18.753094   54807 cri.go:89] found id: ""
	I1202 19:17:18.753108   54807 logs.go:282] 0 containers: []
	W1202 19:17:18.753116   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:17:18.753124   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:17:18.753135   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:17:18.782592   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:17:18.782608   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:17:18.837738   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:17:18.837757   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:17:18.848921   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:17:18.848937   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:17:18.918012   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:17:18.908756   16691 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:18.909735   16691 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:18.911590   16691 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:18.912295   16691 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:18.913925   16691 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1202 19:17:18.918023   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:17:18.918034   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:17:21.481252   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:17:21.491493   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:17:21.491550   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:17:21.515967   54807 cri.go:89] found id: ""
	I1202 19:17:21.515980   54807 logs.go:282] 0 containers: []
	W1202 19:17:21.515987   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:17:21.515993   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:17:21.516049   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:17:21.545239   54807 cri.go:89] found id: ""
	I1202 19:17:21.545256   54807 logs.go:282] 0 containers: []
	W1202 19:17:21.545263   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:17:21.545268   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:17:21.545349   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:17:21.574561   54807 cri.go:89] found id: ""
	I1202 19:17:21.574575   54807 logs.go:282] 0 containers: []
	W1202 19:17:21.574582   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:17:21.574588   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:17:21.574643   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:17:21.600546   54807 cri.go:89] found id: ""
	I1202 19:17:21.600567   54807 logs.go:282] 0 containers: []
	W1202 19:17:21.600575   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:17:21.600581   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:17:21.600647   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:17:21.625602   54807 cri.go:89] found id: ""
	I1202 19:17:21.625616   54807 logs.go:282] 0 containers: []
	W1202 19:17:21.625623   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:17:21.625629   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:17:21.625691   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:17:21.650573   54807 cri.go:89] found id: ""
	I1202 19:17:21.650586   54807 logs.go:282] 0 containers: []
	W1202 19:17:21.650593   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:17:21.650599   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:17:21.650655   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:17:21.680099   54807 cri.go:89] found id: ""
	I1202 19:17:21.680113   54807 logs.go:282] 0 containers: []
	W1202 19:17:21.680120   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:17:21.680128   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:17:21.680155   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:17:21.750582   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:17:21.750601   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:17:21.762564   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:17:21.762580   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:17:21.827497   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:17:21.819360   16787 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:21.820080   16787 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:21.821817   16787 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:21.822472   16787 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:21.824049   16787 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1202 19:17:21.827507   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:17:21.827518   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:17:21.889794   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:17:21.889812   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:17:24.421754   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:17:24.432162   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:17:24.432233   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:17:24.456800   54807 cri.go:89] found id: ""
	I1202 19:17:24.456814   54807 logs.go:282] 0 containers: []
	W1202 19:17:24.456821   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:17:24.456826   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:17:24.456901   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:17:24.481502   54807 cri.go:89] found id: ""
	I1202 19:17:24.481516   54807 logs.go:282] 0 containers: []
	W1202 19:17:24.481523   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:17:24.481529   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:17:24.481587   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:17:24.505876   54807 cri.go:89] found id: ""
	I1202 19:17:24.505918   54807 logs.go:282] 0 containers: []
	W1202 19:17:24.505925   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:17:24.505931   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:17:24.505990   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:17:24.530651   54807 cri.go:89] found id: ""
	I1202 19:17:24.530665   54807 logs.go:282] 0 containers: []
	W1202 19:17:24.530673   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:17:24.530689   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:17:24.530749   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:17:24.556247   54807 cri.go:89] found id: ""
	I1202 19:17:24.556260   54807 logs.go:282] 0 containers: []
	W1202 19:17:24.556277   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:17:24.556283   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:17:24.556391   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:17:24.585748   54807 cri.go:89] found id: ""
	I1202 19:17:24.585761   54807 logs.go:282] 0 containers: []
	W1202 19:17:24.585769   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:17:24.585774   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:17:24.585833   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:17:24.610350   54807 cri.go:89] found id: ""
	I1202 19:17:24.610363   54807 logs.go:282] 0 containers: []
	W1202 19:17:24.610370   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:17:24.610377   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:17:24.610388   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:17:24.680866   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:17:24.664630   16879 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:24.665302   16879 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:24.667106   16879 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:24.667645   16879 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:24.669330   16879 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1202 19:17:24.680876   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:17:24.680887   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:17:24.756955   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:17:24.756975   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:17:24.784854   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:17:24.784869   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:17:24.849848   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:17:24.849872   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:17:27.361613   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:17:27.375047   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:17:27.375145   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:17:27.399753   54807 cri.go:89] found id: ""
	I1202 19:17:27.399767   54807 logs.go:282] 0 containers: []
	W1202 19:17:27.399774   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:17:27.399780   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:17:27.399838   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:17:27.430016   54807 cri.go:89] found id: ""
	I1202 19:17:27.430030   54807 logs.go:282] 0 containers: []
	W1202 19:17:27.430037   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:17:27.430043   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:17:27.430102   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:17:27.455165   54807 cri.go:89] found id: ""
	I1202 19:17:27.455178   54807 logs.go:282] 0 containers: []
	W1202 19:17:27.455186   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:17:27.455191   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:17:27.455251   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:17:27.481353   54807 cri.go:89] found id: ""
	I1202 19:17:27.481367   54807 logs.go:282] 0 containers: []
	W1202 19:17:27.481374   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:17:27.481380   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:17:27.481437   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:17:27.505602   54807 cri.go:89] found id: ""
	I1202 19:17:27.505615   54807 logs.go:282] 0 containers: []
	W1202 19:17:27.505622   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:17:27.505627   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:17:27.505685   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:17:27.531062   54807 cri.go:89] found id: ""
	I1202 19:17:27.531075   54807 logs.go:282] 0 containers: []
	W1202 19:17:27.531082   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:17:27.531087   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:17:27.531143   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:17:27.556614   54807 cri.go:89] found id: ""
	I1202 19:17:27.556628   54807 logs.go:282] 0 containers: []
	W1202 19:17:27.556635   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:17:27.556642   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:17:27.556653   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:17:27.623535   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:17:27.615982   16984 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:27.616782   16984 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:27.618353   16984 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:27.618650   16984 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:27.620147   16984 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1202 19:17:27.623546   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:17:27.623557   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:17:27.692276   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:17:27.692294   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:17:27.728468   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:17:27.728489   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:17:27.790653   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:17:27.790670   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:17:30.302100   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:17:30.313066   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:17:30.313144   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:17:30.340123   54807 cri.go:89] found id: ""
	I1202 19:17:30.340137   54807 logs.go:282] 0 containers: []
	W1202 19:17:30.340144   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:17:30.340149   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:17:30.340208   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:17:30.365806   54807 cri.go:89] found id: ""
	I1202 19:17:30.365820   54807 logs.go:282] 0 containers: []
	W1202 19:17:30.365835   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:17:30.365841   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:17:30.365904   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:17:30.391688   54807 cri.go:89] found id: ""
	I1202 19:17:30.391701   54807 logs.go:282] 0 containers: []
	W1202 19:17:30.391708   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:17:30.391714   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:17:30.391771   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:17:30.416982   54807 cri.go:89] found id: ""
	I1202 19:17:30.416996   54807 logs.go:282] 0 containers: []
	W1202 19:17:30.417013   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:17:30.417019   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:17:30.417117   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:17:30.443139   54807 cri.go:89] found id: ""
	I1202 19:17:30.443153   54807 logs.go:282] 0 containers: []
	W1202 19:17:30.443162   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:17:30.443168   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:17:30.443226   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:17:30.468557   54807 cri.go:89] found id: ""
	I1202 19:17:30.468571   54807 logs.go:282] 0 containers: []
	W1202 19:17:30.468579   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:17:30.468584   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:17:30.468641   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:17:30.494467   54807 cri.go:89] found id: ""
	I1202 19:17:30.494480   54807 logs.go:282] 0 containers: []
	W1202 19:17:30.494488   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:17:30.494502   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:17:30.494515   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:17:30.551986   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:17:30.552005   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:17:30.563168   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:17:30.563184   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:17:30.628562   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:17:30.620608   17094 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:30.621322   17094 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:30.623022   17094 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:30.623454   17094 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:30.625008   17094 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	I1202 19:17:30.628573   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:17:30.628584   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:17:30.691460   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:17:30.691478   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:17:33.223672   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:17:33.234425   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:17:33.234485   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:17:33.262500   54807 cri.go:89] found id: ""
	I1202 19:17:33.262514   54807 logs.go:282] 0 containers: []
	W1202 19:17:33.262521   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:17:33.262527   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:17:33.262590   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:17:33.287888   54807 cri.go:89] found id: ""
	I1202 19:17:33.287902   54807 logs.go:282] 0 containers: []
	W1202 19:17:33.287921   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:17:33.287926   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:17:33.287995   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:17:33.314581   54807 cri.go:89] found id: ""
	I1202 19:17:33.314594   54807 logs.go:282] 0 containers: []
	W1202 19:17:33.314601   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:17:33.314607   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:17:33.314671   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:17:33.338734   54807 cri.go:89] found id: ""
	I1202 19:17:33.338747   54807 logs.go:282] 0 containers: []
	W1202 19:17:33.338755   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:17:33.338760   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:17:33.338818   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:17:33.363343   54807 cri.go:89] found id: ""
	I1202 19:17:33.363356   54807 logs.go:282] 0 containers: []
	W1202 19:17:33.363363   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:17:33.363369   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:17:33.363425   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:17:33.388256   54807 cri.go:89] found id: ""
	I1202 19:17:33.388270   54807 logs.go:282] 0 containers: []
	W1202 19:17:33.388277   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:17:33.388283   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:17:33.388360   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:17:33.412424   54807 cri.go:89] found id: ""
	I1202 19:17:33.412449   54807 logs.go:282] 0 containers: []
	W1202 19:17:33.412456   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:17:33.412465   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:17:33.412475   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:17:33.467817   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:17:33.467835   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:17:33.479194   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:17:33.479209   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:17:33.548484   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:17:33.540299   17201 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:33.540851   17201 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:33.542579   17201 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:33.543074   17201 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:33.544871   17201 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:17:33.540299   17201 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:33.540851   17201 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:33.542579   17201 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:33.543074   17201 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:33.544871   17201 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:17:33.548494   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:17:33.548505   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:17:33.612889   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:17:33.612909   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:17:36.146985   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:17:36.158019   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:17:36.158079   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:17:36.188906   54807 cri.go:89] found id: ""
	I1202 19:17:36.188919   54807 logs.go:282] 0 containers: []
	W1202 19:17:36.188932   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:17:36.188938   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:17:36.188996   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:17:36.213390   54807 cri.go:89] found id: ""
	I1202 19:17:36.213404   54807 logs.go:282] 0 containers: []
	W1202 19:17:36.213411   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:17:36.213416   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:17:36.213481   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:17:36.242801   54807 cri.go:89] found id: ""
	I1202 19:17:36.242814   54807 logs.go:282] 0 containers: []
	W1202 19:17:36.242822   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:17:36.242827   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:17:36.242882   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:17:36.269121   54807 cri.go:89] found id: ""
	I1202 19:17:36.269142   54807 logs.go:282] 0 containers: []
	W1202 19:17:36.269149   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:17:36.269155   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:17:36.269212   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:17:36.295182   54807 cri.go:89] found id: ""
	I1202 19:17:36.295196   54807 logs.go:282] 0 containers: []
	W1202 19:17:36.295203   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:17:36.295208   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:17:36.295265   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:17:36.320684   54807 cri.go:89] found id: ""
	I1202 19:17:36.320698   54807 logs.go:282] 0 containers: []
	W1202 19:17:36.320705   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:17:36.320711   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:17:36.320783   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:17:36.347524   54807 cri.go:89] found id: ""
	I1202 19:17:36.347537   54807 logs.go:282] 0 containers: []
	W1202 19:17:36.347545   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:17:36.347553   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:17:36.347564   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:17:36.358349   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:17:36.358364   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:17:36.419970   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:17:36.412170   17302 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:36.412950   17302 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:36.414493   17302 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:36.414845   17302 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:36.416368   17302 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:17:36.412170   17302 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:36.412950   17302 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:36.414493   17302 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:36.414845   17302 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:36.416368   17302 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:17:36.419980   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:17:36.419991   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:17:36.482180   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:17:36.482199   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:17:36.511443   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:17:36.511458   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:17:39.067437   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:17:39.077694   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:17:39.077763   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:17:39.102742   54807 cri.go:89] found id: ""
	I1202 19:17:39.102755   54807 logs.go:282] 0 containers: []
	W1202 19:17:39.102762   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:17:39.102768   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:17:39.102824   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:17:39.127352   54807 cri.go:89] found id: ""
	I1202 19:17:39.127365   54807 logs.go:282] 0 containers: []
	W1202 19:17:39.127371   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:17:39.127376   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:17:39.127433   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:17:39.155704   54807 cri.go:89] found id: ""
	I1202 19:17:39.155717   54807 logs.go:282] 0 containers: []
	W1202 19:17:39.155725   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:17:39.155730   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:17:39.155793   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:17:39.181102   54807 cri.go:89] found id: ""
	I1202 19:17:39.181121   54807 logs.go:282] 0 containers: []
	W1202 19:17:39.181128   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:17:39.181133   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:17:39.181193   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:17:39.204855   54807 cri.go:89] found id: ""
	I1202 19:17:39.204869   54807 logs.go:282] 0 containers: []
	W1202 19:17:39.204876   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:17:39.204881   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:17:39.204936   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:17:39.228875   54807 cri.go:89] found id: ""
	I1202 19:17:39.228889   54807 logs.go:282] 0 containers: []
	W1202 19:17:39.228896   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:17:39.228901   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:17:39.228961   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:17:39.254647   54807 cri.go:89] found id: ""
	I1202 19:17:39.254661   54807 logs.go:282] 0 containers: []
	W1202 19:17:39.254668   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:17:39.254681   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:17:39.254696   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:17:39.266611   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:17:39.266628   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:17:39.329195   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:17:39.321572   17406 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:39.322010   17406 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:39.323495   17406 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:39.323811   17406 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:39.325283   17406 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:17:39.321572   17406 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:39.322010   17406 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:39.323495   17406 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:39.323811   17406 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:39.325283   17406 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:17:39.329204   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:17:39.329215   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:17:39.390326   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:17:39.390345   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:17:39.419151   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:17:39.419176   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:17:41.975528   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:17:41.989057   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:17:41.989132   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:17:42.018363   54807 cri.go:89] found id: ""
	I1202 19:17:42.018376   54807 logs.go:282] 0 containers: []
	W1202 19:17:42.018384   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:17:42.018390   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:17:42.018453   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:17:42.045176   54807 cri.go:89] found id: ""
	I1202 19:17:42.045192   54807 logs.go:282] 0 containers: []
	W1202 19:17:42.045200   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:17:42.045206   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:17:42.045290   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:17:42.075758   54807 cri.go:89] found id: ""
	I1202 19:17:42.075773   54807 logs.go:282] 0 containers: []
	W1202 19:17:42.075781   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:17:42.075787   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:17:42.075856   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:17:42.111739   54807 cri.go:89] found id: ""
	I1202 19:17:42.111754   54807 logs.go:282] 0 containers: []
	W1202 19:17:42.111760   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:17:42.111767   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:17:42.111829   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:17:42.141340   54807 cri.go:89] found id: ""
	I1202 19:17:42.141358   54807 logs.go:282] 0 containers: []
	W1202 19:17:42.141368   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:17:42.141374   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:17:42.141453   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:17:42.171125   54807 cri.go:89] found id: ""
	I1202 19:17:42.171140   54807 logs.go:282] 0 containers: []
	W1202 19:17:42.171159   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:17:42.171166   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:17:42.171236   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:17:42.200254   54807 cri.go:89] found id: ""
	I1202 19:17:42.200272   54807 logs.go:282] 0 containers: []
	W1202 19:17:42.200280   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:17:42.200292   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:17:42.200307   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:17:42.256751   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:17:42.256772   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:17:42.269101   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:17:42.269118   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:17:42.336339   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:17:42.328432   17515 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:42.329095   17515 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:42.330828   17515 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:42.331361   17515 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:42.332853   17515 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:17:42.328432   17515 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:42.329095   17515 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:42.330828   17515 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:42.331361   17515 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:42.332853   17515 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:17:42.336350   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:17:42.336361   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:17:42.397522   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:17:42.397540   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:17:44.932481   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:17:44.944310   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:17:44.944439   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:17:44.992545   54807 cri.go:89] found id: ""
	I1202 19:17:44.992561   54807 logs.go:282] 0 containers: []
	W1202 19:17:44.992568   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:17:44.992574   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:17:44.992643   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:17:45.041739   54807 cri.go:89] found id: ""
	I1202 19:17:45.041756   54807 logs.go:282] 0 containers: []
	W1202 19:17:45.041764   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:17:45.041770   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:17:45.041849   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:17:45.083378   54807 cri.go:89] found id: ""
	I1202 19:17:45.083394   54807 logs.go:282] 0 containers: []
	W1202 19:17:45.083402   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:17:45.083407   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:17:45.083483   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:17:45.119179   54807 cri.go:89] found id: ""
	I1202 19:17:45.119206   54807 logs.go:282] 0 containers: []
	W1202 19:17:45.119214   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:17:45.119220   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:17:45.119340   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:17:45.156515   54807 cri.go:89] found id: ""
	I1202 19:17:45.156574   54807 logs.go:282] 0 containers: []
	W1202 19:17:45.156583   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:17:45.156590   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:17:45.156760   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:17:45.195862   54807 cri.go:89] found id: ""
	I1202 19:17:45.195877   54807 logs.go:282] 0 containers: []
	W1202 19:17:45.195885   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:17:45.195892   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:17:45.195968   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:17:45.229425   54807 cri.go:89] found id: ""
	I1202 19:17:45.229448   54807 logs.go:282] 0 containers: []
	W1202 19:17:45.229457   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:17:45.229466   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:17:45.229477   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:17:45.293109   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:17:45.293125   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:17:45.303969   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:17:45.303985   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:17:45.371653   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:17:45.362842   17616 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:45.363639   17616 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:45.365366   17616 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:45.366008   17616 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:45.367671   17616 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:17:45.362842   17616 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:45.363639   17616 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:45.365366   17616 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:45.366008   17616 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:45.367671   17616 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:17:45.371662   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:17:45.371673   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:17:45.436450   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:17:45.436469   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:17:47.967684   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:17:47.979933   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:17:47.980001   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:17:48.006489   54807 cri.go:89] found id: ""
	I1202 19:17:48.006503   54807 logs.go:282] 0 containers: []
	W1202 19:17:48.006511   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:17:48.006517   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:17:48.006580   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:17:48.035723   54807 cri.go:89] found id: ""
	I1202 19:17:48.035737   54807 logs.go:282] 0 containers: []
	W1202 19:17:48.035745   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:17:48.035760   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:17:48.035820   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:17:48.065220   54807 cri.go:89] found id: ""
	I1202 19:17:48.065233   54807 logs.go:282] 0 containers: []
	W1202 19:17:48.065251   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:17:48.065260   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:17:48.065332   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:17:48.088782   54807 cri.go:89] found id: ""
	I1202 19:17:48.088796   54807 logs.go:282] 0 containers: []
	W1202 19:17:48.088803   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:17:48.088809   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:17:48.088865   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:17:48.113775   54807 cri.go:89] found id: ""
	I1202 19:17:48.113788   54807 logs.go:282] 0 containers: []
	W1202 19:17:48.113799   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:17:48.113808   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:17:48.113867   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:17:48.140235   54807 cri.go:89] found id: ""
	I1202 19:17:48.140248   54807 logs.go:282] 0 containers: []
	W1202 19:17:48.140254   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:17:48.140260   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:17:48.140315   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:17:48.166089   54807 cri.go:89] found id: ""
	I1202 19:17:48.166102   54807 logs.go:282] 0 containers: []
	W1202 19:17:48.166108   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:17:48.166116   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:17:48.166126   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:17:48.192826   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:17:48.192842   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:17:48.248078   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:17:48.248098   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:17:48.258722   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:17:48.258737   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:17:48.323436   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:17:48.316144   17731 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:48.316536   17731 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:48.318116   17731 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:48.318415   17731 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:48.319885   17731 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:17:48.316144   17731 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:48.316536   17731 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:48.318116   17731 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:48.318415   17731 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:17:48.319885   17731 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:17:48.323445   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:17:48.323456   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 19:17:50.885477   54807 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:17:50.895878   54807 kubeadm.go:602] duration metric: took 4m3.997047772s to restartPrimaryControlPlane
	W1202 19:17:50.895945   54807 out.go:285] ! Unable to restart control-plane node(s), will reset cluster: <no value>
	I1202 19:17:50.896022   54807 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm reset --cri-socket /run/containerd/containerd.sock --force"
	I1202 19:17:51.304711   54807 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I1202 19:17:51.317725   54807 ssh_runner.go:195] Run: sudo cp /var/tmp/minikube/kubeadm.yaml.new /var/tmp/minikube/kubeadm.yaml
	I1202 19:17:51.325312   54807 kubeadm.go:215] ignoring SystemVerification for kubeadm because of docker driver
	I1202 19:17:51.325381   54807 ssh_runner.go:195] Run: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf
	I1202 19:17:51.332895   54807 kubeadm.go:156] config check failed, skipping stale config cleanup: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	ls: cannot access '/etc/kubernetes/admin.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/kubelet.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/controller-manager.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/scheduler.conf': No such file or directory
	I1202 19:17:51.332904   54807 kubeadm.go:158] found existing configuration files:
	
	I1202 19:17:51.332954   54807 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/admin.conf
	I1202 19:17:51.340776   54807 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/admin.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/admin.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/admin.conf: No such file or directory
	I1202 19:17:51.340830   54807 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/admin.conf
	I1202 19:17:51.348141   54807 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf
	I1202 19:17:51.355804   54807 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/kubelet.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/kubelet.conf: No such file or directory
	I1202 19:17:51.355867   54807 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/kubelet.conf
	I1202 19:17:51.363399   54807 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf
	I1202 19:17:51.371055   54807 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/controller-manager.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/controller-manager.conf: No such file or directory
	I1202 19:17:51.371110   54807 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/controller-manager.conf
	I1202 19:17:51.378528   54807 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf
	I1202 19:17:51.386558   54807 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/scheduler.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/scheduler.conf: No such file or directory
	I1202 19:17:51.386618   54807 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/scheduler.conf
	I1202 19:17:51.394349   54807 ssh_runner.go:286] Start: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables"
	I1202 19:17:51.435339   54807 kubeadm.go:319] [init] Using Kubernetes version: v1.35.0-beta.0
	I1202 19:17:51.435446   54807 kubeadm.go:319] [preflight] Running pre-flight checks
	I1202 19:17:51.512672   54807 kubeadm.go:319] [preflight] The system verification failed. Printing the output from the verification:
	I1202 19:17:51.512738   54807 kubeadm.go:319] KERNEL_VERSION: 5.15.0-1084-aws
	I1202 19:17:51.512772   54807 kubeadm.go:319] OS: Linux
	I1202 19:17:51.512816   54807 kubeadm.go:319] CGROUPS_CPU: enabled
	I1202 19:17:51.512863   54807 kubeadm.go:319] CGROUPS_CPUACCT: enabled
	I1202 19:17:51.512909   54807 kubeadm.go:319] CGROUPS_CPUSET: enabled
	I1202 19:17:51.512961   54807 kubeadm.go:319] CGROUPS_DEVICES: enabled
	I1202 19:17:51.513009   54807 kubeadm.go:319] CGROUPS_FREEZER: enabled
	I1202 19:17:51.513055   54807 kubeadm.go:319] CGROUPS_MEMORY: enabled
	I1202 19:17:51.513099   54807 kubeadm.go:319] CGROUPS_PIDS: enabled
	I1202 19:17:51.513146   54807 kubeadm.go:319] CGROUPS_HUGETLB: enabled
	I1202 19:17:51.513190   54807 kubeadm.go:319] CGROUPS_BLKIO: enabled
	I1202 19:17:51.580412   54807 kubeadm.go:319] [preflight] Pulling images required for setting up a Kubernetes cluster
	I1202 19:17:51.580517   54807 kubeadm.go:319] [preflight] This might take a minute or two, depending on the speed of your internet connection
	I1202 19:17:51.580607   54807 kubeadm.go:319] [preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	I1202 19:17:51.588752   54807 kubeadm.go:319] [certs] Using certificateDir folder "/var/lib/minikube/certs"
	I1202 19:17:51.594117   54807 out.go:252]   - Generating certificates and keys ...
	I1202 19:17:51.594201   54807 kubeadm.go:319] [certs] Using existing ca certificate authority
	I1202 19:17:51.594273   54807 kubeadm.go:319] [certs] Using existing apiserver certificate and key on disk
	I1202 19:17:51.594354   54807 kubeadm.go:319] [certs] Using existing apiserver-kubelet-client certificate and key on disk
	I1202 19:17:51.594424   54807 kubeadm.go:319] [certs] Using existing front-proxy-ca certificate authority
	I1202 19:17:51.594494   54807 kubeadm.go:319] [certs] Using existing front-proxy-client certificate and key on disk
	I1202 19:17:51.594547   54807 kubeadm.go:319] [certs] Using existing etcd/ca certificate authority
	I1202 19:17:51.594610   54807 kubeadm.go:319] [certs] Using existing etcd/server certificate and key on disk
	I1202 19:17:51.594671   54807 kubeadm.go:319] [certs] Using existing etcd/peer certificate and key on disk
	I1202 19:17:51.594744   54807 kubeadm.go:319] [certs] Using existing etcd/healthcheck-client certificate and key on disk
	I1202 19:17:51.594818   54807 kubeadm.go:319] [certs] Using existing apiserver-etcd-client certificate and key on disk
	I1202 19:17:51.594855   54807 kubeadm.go:319] [certs] Using the existing "sa" key
	I1202 19:17:51.594910   54807 kubeadm.go:319] [kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	I1202 19:17:51.705531   54807 kubeadm.go:319] [kubeconfig] Writing "admin.conf" kubeconfig file
	I1202 19:17:51.854203   54807 kubeadm.go:319] [kubeconfig] Writing "super-admin.conf" kubeconfig file
	I1202 19:17:52.029847   54807 kubeadm.go:319] [kubeconfig] Writing "kubelet.conf" kubeconfig file
	I1202 19:17:52.545269   54807 kubeadm.go:319] [kubeconfig] Writing "controller-manager.conf" kubeconfig file
	I1202 19:17:52.727822   54807 kubeadm.go:319] [kubeconfig] Writing "scheduler.conf" kubeconfig file
	I1202 19:17:52.728412   54807 kubeadm.go:319] [etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	I1202 19:17:52.730898   54807 kubeadm.go:319] [control-plane] Using manifest folder "/etc/kubernetes/manifests"
	I1202 19:17:52.734122   54807 out.go:252]   - Booting up control plane ...
	I1202 19:17:52.734222   54807 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-apiserver"
	I1202 19:17:52.734305   54807 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-controller-manager"
	I1202 19:17:52.734375   54807 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-scheduler"
	I1202 19:17:52.754118   54807 kubeadm.go:319] [kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	I1202 19:17:52.754386   54807 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	I1202 19:17:52.762146   54807 kubeadm.go:319] [patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	I1202 19:17:52.762405   54807 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	I1202 19:17:52.762460   54807 kubeadm.go:319] [kubelet-start] Starting the kubelet
	I1202 19:17:52.891581   54807 kubeadm.go:319] [wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	I1202 19:17:52.891694   54807 kubeadm.go:319] [kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	I1202 19:21:52.892779   54807 kubeadm.go:319] [kubelet-check] The kubelet is not healthy after 4m0.001197768s
	I1202 19:21:52.892808   54807 kubeadm.go:319] 
	I1202 19:21:52.892871   54807 kubeadm.go:319] Unfortunately, an error has occurred, likely caused by:
	I1202 19:21:52.892903   54807 kubeadm.go:319] 	- The kubelet is not running
	I1202 19:21:52.893025   54807 kubeadm.go:319] 	- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	I1202 19:21:52.893030   54807 kubeadm.go:319] 
	I1202 19:21:52.893133   54807 kubeadm.go:319] If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
	I1202 19:21:52.893170   54807 kubeadm.go:319] 	- 'systemctl status kubelet'
	I1202 19:21:52.893200   54807 kubeadm.go:319] 	- 'journalctl -xeu kubelet'
	I1202 19:21:52.893203   54807 kubeadm.go:319] 
	I1202 19:21:52.897451   54807 kubeadm.go:319] 	[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
	I1202 19:21:52.897878   54807 kubeadm.go:319] 	[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
	I1202 19:21:52.897986   54807 kubeadm.go:319] 	[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	I1202 19:21:52.898220   54807 kubeadm.go:319] error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	I1202 19:21:52.898225   54807 kubeadm.go:319] 
	I1202 19:21:52.898299   54807 kubeadm.go:319] To see the stack trace of this error execute with --v=5 or higher
	W1202 19:21:52.898412   54807 out.go:285] ! initialization failed, will try again: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-beta.0
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.001197768s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	
	To see the stack trace of this error execute with --v=5 or higher
	
	I1202 19:21:52.898501   54807 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm reset --cri-socket /run/containerd/containerd.sock --force"
	I1202 19:21:53.323346   54807 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I1202 19:21:53.337542   54807 kubeadm.go:215] ignoring SystemVerification for kubeadm because of docker driver
	I1202 19:21:53.337600   54807 ssh_runner.go:195] Run: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf
	I1202 19:21:53.345331   54807 kubeadm.go:156] config check failed, skipping stale config cleanup: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	ls: cannot access '/etc/kubernetes/admin.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/kubelet.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/controller-manager.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/scheduler.conf': No such file or directory
	I1202 19:21:53.345341   54807 kubeadm.go:158] found existing configuration files:
	
	I1202 19:21:53.345394   54807 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/admin.conf
	I1202 19:21:53.352948   54807 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/admin.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/admin.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/admin.conf: No such file or directory
	I1202 19:21:53.353002   54807 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/admin.conf
	I1202 19:21:53.360251   54807 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf
	I1202 19:21:53.367769   54807 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/kubelet.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/kubelet.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/kubelet.conf: No such file or directory
	I1202 19:21:53.367833   54807 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/kubelet.conf
	I1202 19:21:53.375319   54807 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf
	I1202 19:21:53.383107   54807 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/controller-manager.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/controller-manager.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/controller-manager.conf: No such file or directory
	I1202 19:21:53.383164   54807 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/controller-manager.conf
	I1202 19:21:53.390823   54807 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf
	I1202 19:21:53.398923   54807 kubeadm.go:164] "https://control-plane.minikube.internal:8441" may not be in /etc/kubernetes/scheduler.conf - will remove: sudo grep https://control-plane.minikube.internal:8441 /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/scheduler.conf: No such file or directory
	I1202 19:21:53.398982   54807 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/scheduler.conf
	I1202 19:21:53.406858   54807 ssh_runner.go:286] Start: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables"
	I1202 19:21:53.455640   54807 kubeadm.go:319] [init] Using Kubernetes version: v1.35.0-beta.0
	I1202 19:21:53.455689   54807 kubeadm.go:319] [preflight] Running pre-flight checks
	I1202 19:21:53.530940   54807 kubeadm.go:319] [preflight] The system verification failed. Printing the output from the verification:
	I1202 19:21:53.531008   54807 kubeadm.go:319] KERNEL_VERSION: 5.15.0-1084-aws
	I1202 19:21:53.531042   54807 kubeadm.go:319] OS: Linux
	I1202 19:21:53.531086   54807 kubeadm.go:319] CGROUPS_CPU: enabled
	I1202 19:21:53.531133   54807 kubeadm.go:319] CGROUPS_CPUACCT: enabled
	I1202 19:21:53.531179   54807 kubeadm.go:319] CGROUPS_CPUSET: enabled
	I1202 19:21:53.531226   54807 kubeadm.go:319] CGROUPS_DEVICES: enabled
	I1202 19:21:53.531273   54807 kubeadm.go:319] CGROUPS_FREEZER: enabled
	I1202 19:21:53.531320   54807 kubeadm.go:319] CGROUPS_MEMORY: enabled
	I1202 19:21:53.531364   54807 kubeadm.go:319] CGROUPS_PIDS: enabled
	I1202 19:21:53.531410   54807 kubeadm.go:319] CGROUPS_HUGETLB: enabled
	I1202 19:21:53.531455   54807 kubeadm.go:319] CGROUPS_BLKIO: enabled
	I1202 19:21:53.605461   54807 kubeadm.go:319] [preflight] Pulling images required for setting up a Kubernetes cluster
	I1202 19:21:53.605584   54807 kubeadm.go:319] [preflight] This might take a minute or two, depending on the speed of your internet connection
	I1202 19:21:53.605706   54807 kubeadm.go:319] [preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	I1202 19:21:53.611090   54807 kubeadm.go:319] [certs] Using certificateDir folder "/var/lib/minikube/certs"
	I1202 19:21:53.616552   54807 out.go:252]   - Generating certificates and keys ...
	I1202 19:21:53.616667   54807 kubeadm.go:319] [certs] Using existing ca certificate authority
	I1202 19:21:53.616734   54807 kubeadm.go:319] [certs] Using existing apiserver certificate and key on disk
	I1202 19:21:53.616826   54807 kubeadm.go:319] [certs] Using existing apiserver-kubelet-client certificate and key on disk
	I1202 19:21:53.616887   54807 kubeadm.go:319] [certs] Using existing front-proxy-ca certificate authority
	I1202 19:21:53.616955   54807 kubeadm.go:319] [certs] Using existing front-proxy-client certificate and key on disk
	I1202 19:21:53.617008   54807 kubeadm.go:319] [certs] Using existing etcd/ca certificate authority
	I1202 19:21:53.617070   54807 kubeadm.go:319] [certs] Using existing etcd/server certificate and key on disk
	I1202 19:21:53.617132   54807 kubeadm.go:319] [certs] Using existing etcd/peer certificate and key on disk
	I1202 19:21:53.617207   54807 kubeadm.go:319] [certs] Using existing etcd/healthcheck-client certificate and key on disk
	I1202 19:21:53.617278   54807 kubeadm.go:319] [certs] Using existing apiserver-etcd-client certificate and key on disk
	I1202 19:21:53.617314   54807 kubeadm.go:319] [certs] Using the existing "sa" key
	I1202 19:21:53.617369   54807 kubeadm.go:319] [kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	I1202 19:21:53.704407   54807 kubeadm.go:319] [kubeconfig] Writing "admin.conf" kubeconfig file
	I1202 19:21:53.921613   54807 kubeadm.go:319] [kubeconfig] Writing "super-admin.conf" kubeconfig file
	I1202 19:21:54.521217   54807 kubeadm.go:319] [kubeconfig] Writing "kubelet.conf" kubeconfig file
	I1202 19:21:54.609103   54807 kubeadm.go:319] [kubeconfig] Writing "controller-manager.conf" kubeconfig file
	I1202 19:21:54.800380   54807 kubeadm.go:319] [kubeconfig] Writing "scheduler.conf" kubeconfig file
	I1202 19:21:54.800923   54807 kubeadm.go:319] [etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	I1202 19:21:54.803676   54807 kubeadm.go:319] [control-plane] Using manifest folder "/etc/kubernetes/manifests"
	I1202 19:21:54.806989   54807 out.go:252]   - Booting up control plane ...
	I1202 19:21:54.807091   54807 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-apiserver"
	I1202 19:21:54.807173   54807 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-controller-manager"
	I1202 19:21:54.807243   54807 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-scheduler"
	I1202 19:21:54.831648   54807 kubeadm.go:319] [kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	I1202 19:21:54.831750   54807 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	I1202 19:21:54.839547   54807 kubeadm.go:319] [patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	I1202 19:21:54.840014   54807 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	I1202 19:21:54.840081   54807 kubeadm.go:319] [kubelet-start] Starting the kubelet
	I1202 19:21:54.986075   54807 kubeadm.go:319] [wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	I1202 19:21:54.986189   54807 kubeadm.go:319] [kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	I1202 19:25:54.986676   54807 kubeadm.go:319] [kubelet-check] The kubelet is not healthy after 4m0.001082452s
	I1202 19:25:54.986700   54807 kubeadm.go:319] 
	I1202 19:25:54.986752   54807 kubeadm.go:319] Unfortunately, an error has occurred, likely caused by:
	I1202 19:25:54.986782   54807 kubeadm.go:319] 	- The kubelet is not running
	I1202 19:25:54.986880   54807 kubeadm.go:319] 	- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	I1202 19:25:54.986884   54807 kubeadm.go:319] 
	I1202 19:25:54.986982   54807 kubeadm.go:319] If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
	I1202 19:25:54.987011   54807 kubeadm.go:319] 	- 'systemctl status kubelet'
	I1202 19:25:54.987040   54807 kubeadm.go:319] 	- 'journalctl -xeu kubelet'
	I1202 19:25:54.987043   54807 kubeadm.go:319] 
	I1202 19:25:54.991498   54807 kubeadm.go:319] 	[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
	I1202 19:25:54.991923   54807 kubeadm.go:319] 	[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
	I1202 19:25:54.992031   54807 kubeadm.go:319] 	[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	I1202 19:25:54.992264   54807 kubeadm.go:319] error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	I1202 19:25:54.992269   54807 kubeadm.go:319] 
	I1202 19:25:54.992355   54807 kubeadm.go:319] To see the stack trace of this error execute with --v=5 or higher
	I1202 19:25:54.992407   54807 kubeadm.go:403] duration metric: took 12m8.130118214s to StartCluster
	I1202 19:25:54.992437   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 19:25:54.992498   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 19:25:55.018059   54807 cri.go:89] found id: ""
	I1202 19:25:55.018073   54807 logs.go:282] 0 containers: []
	W1202 19:25:55.018079   54807 logs.go:284] No container was found matching "kube-apiserver"
	I1202 19:25:55.018085   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 19:25:55.018141   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 19:25:55.046728   54807 cri.go:89] found id: ""
	I1202 19:25:55.046741   54807 logs.go:282] 0 containers: []
	W1202 19:25:55.046749   54807 logs.go:284] No container was found matching "etcd"
	I1202 19:25:55.046755   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 19:25:55.046820   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 19:25:55.073607   54807 cri.go:89] found id: ""
	I1202 19:25:55.073621   54807 logs.go:282] 0 containers: []
	W1202 19:25:55.073629   54807 logs.go:284] No container was found matching "coredns"
	I1202 19:25:55.073638   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 19:25:55.073698   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 19:25:55.098149   54807 cri.go:89] found id: ""
	I1202 19:25:55.098163   54807 logs.go:282] 0 containers: []
	W1202 19:25:55.098170   54807 logs.go:284] No container was found matching "kube-scheduler"
	I1202 19:25:55.098175   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 19:25:55.098231   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 19:25:55.126700   54807 cri.go:89] found id: ""
	I1202 19:25:55.126714   54807 logs.go:282] 0 containers: []
	W1202 19:25:55.126721   54807 logs.go:284] No container was found matching "kube-proxy"
	I1202 19:25:55.126727   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 19:25:55.126783   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 19:25:55.151684   54807 cri.go:89] found id: ""
	I1202 19:25:55.151697   54807 logs.go:282] 0 containers: []
	W1202 19:25:55.151704   54807 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 19:25:55.151718   54807 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 19:25:55.151776   54807 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 19:25:55.179814   54807 cri.go:89] found id: ""
	I1202 19:25:55.179827   54807 logs.go:282] 0 containers: []
	W1202 19:25:55.179834   54807 logs.go:284] No container was found matching "kindnet"
	I1202 19:25:55.179842   54807 logs.go:123] Gathering logs for container status ...
	I1202 19:25:55.179852   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 19:25:55.209677   54807 logs.go:123] Gathering logs for kubelet ...
	I1202 19:25:55.209693   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 19:25:55.267260   54807 logs.go:123] Gathering logs for dmesg ...
	I1202 19:25:55.267277   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 19:25:55.278280   54807 logs.go:123] Gathering logs for describe nodes ...
	I1202 19:25:55.278301   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 19:25:55.341995   54807 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:25:55.334367   21526 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:25:55.334759   21526 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:25:55.336266   21526 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:25:55.336611   21526 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:25:55.338027   21526 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	E1202 19:25:55.334367   21526 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:25:55.334759   21526 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:25:55.336266   21526 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:25:55.336611   21526 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:25:55.338027   21526 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 19:25:55.342006   54807 logs.go:123] Gathering logs for containerd ...
	I1202 19:25:55.342016   54807 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	W1202 19:25:55.404636   54807 out.go:434] Error starting cluster: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-beta.0
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.001082452s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	
	To see the stack trace of this error execute with --v=5 or higher
	W1202 19:25:55.404681   54807 out.go:285] * 
	W1202 19:25:55.404792   54807 out.go:285] X Error starting cluster: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	W1202 19:25:55.404837   54807 out.go:285] * 
	W1202 19:25:55.406981   54807 out.go:308] ╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	I1202 19:25:55.412566   54807 out.go:203] 
	W1202 19:25:55.416194   54807 out.go:285] X Exiting due to K8S_KUBELET_NOT_RUNNING: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	W1202 19:25:55.416239   54807 out.go:285] * Suggestion: Check output of 'journalctl -xeu kubelet', try passing --extra-config=kubelet.cgroup-driver=systemd to minikube start
	W1202 19:25:55.416259   54807 out.go:285] * Related issue: https://github.com/kubernetes/minikube/issues/4172
	I1202 19:25:55.420152   54807 out.go:203] 
	
	
	==> container status <==
	CONTAINER           IMAGE               CREATED             STATE               NAME                ATTEMPT             POD ID              POD                 NAMESPACE
	
	
	==> containerd <==
	Dec 02 19:13:45 functional-449836 containerd[10263]: time="2025-12-02T19:13:45.578599853Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1
	Dec 02 19:13:45 functional-449836 containerd[10263]: time="2025-12-02T19:13:45.578662622Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1
	Dec 02 19:13:45 functional-449836 containerd[10263]: time="2025-12-02T19:13:45.578725071Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1
	Dec 02 19:13:45 functional-449836 containerd[10263]: time="2025-12-02T19:13:45.578783803Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1
	Dec 02 19:13:45 functional-449836 containerd[10263]: time="2025-12-02T19:13:45.578855024Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1
	Dec 02 19:13:45 functional-449836 containerd[10263]: time="2025-12-02T19:13:45.578924119Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1
	Dec 02 19:13:45 functional-449836 containerd[10263]: time="2025-12-02T19:13:45.578991820Z" level=info msg="runtime interface created"
	Dec 02 19:13:45 functional-449836 containerd[10263]: time="2025-12-02T19:13:45.579043832Z" level=info msg="created NRI interface"
	Dec 02 19:13:45 functional-449836 containerd[10263]: time="2025-12-02T19:13:45.579105847Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1
	Dec 02 19:13:45 functional-449836 containerd[10263]: time="2025-12-02T19:13:45.579207451Z" level=info msg="Connect containerd service"
	Dec 02 19:13:45 functional-449836 containerd[10263]: time="2025-12-02T19:13:45.579595759Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this"
	Dec 02 19:13:45 functional-449836 containerd[10263]: time="2025-12-02T19:13:45.580416453Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config"
	Dec 02 19:13:45 functional-449836 containerd[10263]: time="2025-12-02T19:13:45.590441353Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc
	Dec 02 19:13:45 functional-449836 containerd[10263]: time="2025-12-02T19:13:45.590507150Z" level=info msg=serving... address=/run/containerd/containerd.sock
	Dec 02 19:13:45 functional-449836 containerd[10263]: time="2025-12-02T19:13:45.590537673Z" level=info msg="Start subscribing containerd event"
	Dec 02 19:13:45 functional-449836 containerd[10263]: time="2025-12-02T19:13:45.590591277Z" level=info msg="Start recovering state"
	Dec 02 19:13:45 functional-449836 containerd[10263]: time="2025-12-02T19:13:45.614386130Z" level=info msg="Start event monitor"
	Dec 02 19:13:45 functional-449836 containerd[10263]: time="2025-12-02T19:13:45.614577326Z" level=info msg="Start cni network conf syncer for default"
	Dec 02 19:13:45 functional-449836 containerd[10263]: time="2025-12-02T19:13:45.614677601Z" level=info msg="Start streaming server"
	Dec 02 19:13:45 functional-449836 containerd[10263]: time="2025-12-02T19:13:45.614762451Z" level=info msg="Registered namespace \"k8s.io\" with NRI"
	Dec 02 19:13:45 functional-449836 containerd[10263]: time="2025-12-02T19:13:45.614968071Z" level=info msg="runtime interface starting up..."
	Dec 02 19:13:45 functional-449836 containerd[10263]: time="2025-12-02T19:13:45.615037774Z" level=info msg="starting plugins..."
	Dec 02 19:13:45 functional-449836 containerd[10263]: time="2025-12-02T19:13:45.615100272Z" level=info msg="Synchronizing NRI (plugin) with current runtime state"
	Dec 02 19:13:45 functional-449836 containerd[10263]: time="2025-12-02T19:13:45.615329048Z" level=info msg="containerd successfully booted in 0.058232s"
	Dec 02 19:13:45 functional-449836 systemd[1]: Started containerd.service - containerd container runtime.
	
	
	==> describe nodes <==
	command /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" failed with error: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:27:38.529759   22927 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:27:38.530338   22927 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:27:38.531850   22927 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:27:38.532488   22927 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:27:38.534002   22927 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	
	==> dmesg <==
	[Dec 2 18:17] ACPI: SRAT not present
	[  +0.000000] ACPI: SRAT not present
	[  +0.000000] SPI driver altr_a10sr has no spi_device_id for altr,a10sr
	[  +0.015127] device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
	[  +0.494583] systemd[1]: Configuration file /run/systemd/system/netplan-ovs-cleanup.service is marked world-inaccessible. This has no effect as configuration data is accessible via APIs without restrictions. Proceeding anyway.
	[  +0.035754] systemd[1]: /lib/systemd/system/snapd.service:23: Unknown key name 'RestartMode' in section 'Service', ignoring.
	[  +0.870945] ena 0000:00:05.0: LLQ is not supported Fallback to host mode policy.
	[  +6.299680] kauditd_printk_skb: 36 callbacks suppressed
	
	
	==> kernel <==
	 19:27:38 up  1:09,  0 user,  load average: 0.39, 0.22, 0.34
	Linux functional-449836 5.15.0-1084-aws #91~20.04.1-Ubuntu SMP Fri May 2 07:00:04 UTC 2025 aarch64 GNU/Linux
	PRETTY_NAME="Debian GNU/Linux 12 (bookworm)"
	
	
	==> kubelet <==
	Dec 02 19:27:34 functional-449836 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 02 19:27:35 functional-449836 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 454.
	Dec 02 19:27:35 functional-449836 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 02 19:27:35 functional-449836 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 02 19:27:35 functional-449836 kubelet[22810]: E1202 19:27:35.716876   22810 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 02 19:27:35 functional-449836 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 02 19:27:35 functional-449836 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 02 19:27:36 functional-449836 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 455.
	Dec 02 19:27:36 functional-449836 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 02 19:27:36 functional-449836 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 02 19:27:36 functional-449836 kubelet[22816]: E1202 19:27:36.472636   22816 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 02 19:27:36 functional-449836 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 02 19:27:36 functional-449836 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 02 19:27:37 functional-449836 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 456.
	Dec 02 19:27:37 functional-449836 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 02 19:27:37 functional-449836 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 02 19:27:37 functional-449836 kubelet[22821]: E1202 19:27:37.246356   22821 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 02 19:27:37 functional-449836 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 02 19:27:37 functional-449836 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 02 19:27:37 functional-449836 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 457.
	Dec 02 19:27:37 functional-449836 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 02 19:27:37 functional-449836 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 02 19:27:37 functional-449836 kubelet[22843]: E1202 19:27:37.996419   22843 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 02 19:27:37 functional-449836 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 02 19:27:37 functional-449836 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	

-- /stdout --
helpers_test.go:262: (dbg) Run:  out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-449836 -n functional-449836
helpers_test.go:262: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-449836 -n functional-449836: exit status 2 (371.464511ms)

-- stdout --
	Stopped

-- /stdout --
helpers_test.go:262: status error: exit status 2 (may be ok)
helpers_test.go:264: "functional-449836" apiserver is not running, skipping kubectl commands (state="Stopped")
--- FAIL: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ServiceCmdConnect (2.49s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim (241.69s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim

=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim
functional_test_pvc_test.go:50: (dbg) TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: waiting 4m0s for pods matching "integration-test=storage-provisioner" in namespace "kube-system" ...
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
I1202 19:26:13.975239    4435 retry.go:31] will retry after 3.684129375s: Temporary Error: Get "http://10.101.95.205": context deadline exceeded (Client.Timeout exceeded while awaiting headers)
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
I1202 19:26:27.659636    4435 retry.go:31] will retry after 2.66535472s: Temporary Error: Get "http://10.101.95.205": context deadline exceeded (Client.Timeout exceeded while awaiting headers)
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
I1202 19:26:40.326156    4435 retry.go:31] will retry after 6.8063635s: Temporary Error: Get "http://10.101.95.205": context deadline exceeded (Client.Timeout exceeded while awaiting headers)
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
I1202 19:26:57.133560    4435 retry.go:31] will retry after 11.098333659s: Temporary Error: Get "http://10.101.95.205": context deadline exceeded (Client.Timeout exceeded while awaiting headers)
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
--- previous message repeated 16 times ---
I1202 19:27:18.232218    4435 retry.go:31] will retry after 8.475451752s: Temporary Error: Get "http://10.101.95.205": context deadline exceeded (Client.Timeout exceeded while awaiting headers)
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
--- previous message repeated 42 times ---
E1202 19:28:00.704052    4435 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-224594/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
--- previous message repeated 37 times ---
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
E1202 19:28:49.133471    4435 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/addons-932514/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: Get "https://192.168.49.2:8441/api/v1/namespaces/kube-system/pods?labelSelector=integration-test%3Dstorage-provisioner": dial tcp 192.168.49.2:8441: connect: connection refused
helpers_test.go:337: (previous warning repeated 74 more times)
helpers_test.go:337: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: WARNING: pod list for "kube-system" "integration-test=storage-provisioner" returned: client rate limiter Wait returned an error: context deadline exceeded
functional_test_pvc_test.go:50: ***** TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim: pod "integration-test=storage-provisioner" failed to start within 4m0s: context deadline exceeded ****
functional_test_pvc_test.go:50: (dbg) Run:  out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-449836 -n functional-449836
functional_test_pvc_test.go:50: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-449836 -n functional-449836: exit status 2 (308.427981ms)

                                                
                                                
-- stdout --
	Stopped

                                                
                                                
-- /stdout --
functional_test_pvc_test.go:50: status error: exit status 2 (may be ok)
functional_test_pvc_test.go:50: "functional-449836" apiserver is not running, skipping kubectl commands (state="Stopped")
functional_test_pvc_test.go:51: failed waiting for storage-provisioner: integration-test=storage-provisioner within 4m0s: context deadline exceeded
helpers_test.go:222: -----------------------post-mortem--------------------------------
helpers_test.go:223: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim]: network settings <======
helpers_test.go:230: HOST ENV snapshots: PROXY env: HTTP_PROXY="<empty>" HTTPS_PROXY="<empty>" NO_PROXY="<empty>"
helpers_test.go:238: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim]: docker inspect <======
helpers_test.go:239: (dbg) Run:  docker inspect functional-449836
helpers_test.go:243: (dbg) docker inspect functional-449836:

                                                
                                                
-- stdout --
	[
	    {
	        "Id": "6870f21b62bb6903aca3129f1ce4723cf3f2ffad99a50b164f8e2dc04b50e75d",
	        "Created": "2025-12-02T18:58:53.515075222Z",
	        "Path": "/usr/local/bin/entrypoint",
	        "Args": [
	            "/sbin/init"
	        ],
	        "State": {
	            "Status": "running",
	            "Running": true,
	            "Paused": false,
	            "Restarting": false,
	            "OOMKilled": false,
	            "Dead": false,
	            "Pid": 43093,
	            "ExitCode": 0,
	            "Error": "",
	            "StartedAt": "2025-12-02T18:58:53.587847975Z",
	            "FinishedAt": "0001-01-01T00:00:00Z"
	        },
	        "Image": "sha256:ac919894123858c63a6b115b7a0677e38aafc32ba4f00c3ebbd7c61e958451be",
	        "ResolvConfPath": "/var/lib/docker/containers/6870f21b62bb6903aca3129f1ce4723cf3f2ffad99a50b164f8e2dc04b50e75d/resolv.conf",
	        "HostnamePath": "/var/lib/docker/containers/6870f21b62bb6903aca3129f1ce4723cf3f2ffad99a50b164f8e2dc04b50e75d/hostname",
	        "HostsPath": "/var/lib/docker/containers/6870f21b62bb6903aca3129f1ce4723cf3f2ffad99a50b164f8e2dc04b50e75d/hosts",
	        "LogPath": "/var/lib/docker/containers/6870f21b62bb6903aca3129f1ce4723cf3f2ffad99a50b164f8e2dc04b50e75d/6870f21b62bb6903aca3129f1ce4723cf3f2ffad99a50b164f8e2dc04b50e75d-json.log",
	        "Name": "/functional-449836",
	        "RestartCount": 0,
	        "Driver": "overlay2",
	        "Platform": "linux",
	        "MountLabel": "",
	        "ProcessLabel": "",
	        "AppArmorProfile": "unconfined",
	        "ExecIDs": null,
	        "HostConfig": {
	            "Binds": [
	                "/lib/modules:/lib/modules:ro",
	                "functional-449836:/var"
	            ],
	            "ContainerIDFile": "",
	            "LogConfig": {
	                "Type": "json-file",
	                "Config": {}
	            },
	            "NetworkMode": "functional-449836",
	            "PortBindings": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "8441/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ]
	            },
	            "RestartPolicy": {
	                "Name": "no",
	                "MaximumRetryCount": 0
	            },
	            "AutoRemove": false,
	            "VolumeDriver": "",
	            "VolumesFrom": null,
	            "ConsoleSize": [
	                0,
	                0
	            ],
	            "CapAdd": null,
	            "CapDrop": null,
	            "CgroupnsMode": "host",
	            "Dns": [],
	            "DnsOptions": [],
	            "DnsSearch": [],
	            "ExtraHosts": null,
	            "GroupAdd": null,
	            "IpcMode": "private",
	            "Cgroup": "",
	            "Links": null,
	            "OomScoreAdj": 0,
	            "PidMode": "",
	            "Privileged": true,
	            "PublishAllPorts": false,
	            "ReadonlyRootfs": false,
	            "SecurityOpt": [
	                "seccomp=unconfined",
	                "apparmor=unconfined",
	                "label=disable"
	            ],
	            "Tmpfs": {
	                "/run": "",
	                "/tmp": ""
	            },
	            "UTSMode": "",
	            "UsernsMode": "",
	            "ShmSize": 67108864,
	            "Runtime": "runc",
	            "Isolation": "",
	            "CpuShares": 0,
	            "Memory": 4294967296,
	            "NanoCpus": 2000000000,
	            "CgroupParent": "",
	            "BlkioWeight": 0,
	            "BlkioWeightDevice": [],
	            "BlkioDeviceReadBps": [],
	            "BlkioDeviceWriteBps": [],
	            "BlkioDeviceReadIOps": [],
	            "BlkioDeviceWriteIOps": [],
	            "CpuPeriod": 0,
	            "CpuQuota": 0,
	            "CpuRealtimePeriod": 0,
	            "CpuRealtimeRuntime": 0,
	            "CpusetCpus": "",
	            "CpusetMems": "",
	            "Devices": [],
	            "DeviceCgroupRules": null,
	            "DeviceRequests": null,
	            "MemoryReservation": 0,
	            "MemorySwap": 8589934592,
	            "MemorySwappiness": null,
	            "OomKillDisable": false,
	            "PidsLimit": null,
	            "Ulimits": [],
	            "CpuCount": 0,
	            "CpuPercent": 0,
	            "IOMaximumIOps": 0,
	            "IOMaximumBandwidth": 0,
	            "MaskedPaths": null,
	            "ReadonlyPaths": null
	        },
	        "GraphDriver": {
	            "Data": {
	                "ID": "6870f21b62bb6903aca3129f1ce4723cf3f2ffad99a50b164f8e2dc04b50e75d",
	                "LowerDir": "/var/lib/docker/overlay2/41ea5a18fbf8709879a7fc4066a6a4a1474aa86e898ee1ccabe5669a1871131d-init/diff:/var/lib/docker/overlay2/a59c61675ee48e07a7f4a8725bd393449453344ad8907963779ea1c0059d936c/diff",
	                "MergedDir": "/var/lib/docker/overlay2/41ea5a18fbf8709879a7fc4066a6a4a1474aa86e898ee1ccabe5669a1871131d/merged",
	                "UpperDir": "/var/lib/docker/overlay2/41ea5a18fbf8709879a7fc4066a6a4a1474aa86e898ee1ccabe5669a1871131d/diff",
	                "WorkDir": "/var/lib/docker/overlay2/41ea5a18fbf8709879a7fc4066a6a4a1474aa86e898ee1ccabe5669a1871131d/work"
	            },
	            "Name": "overlay2"
	        },
	        "Mounts": [
	            {
	                "Type": "bind",
	                "Source": "/lib/modules",
	                "Destination": "/lib/modules",
	                "Mode": "ro",
	                "RW": false,
	                "Propagation": "rprivate"
	            },
	            {
	                "Type": "volume",
	                "Name": "functional-449836",
	                "Source": "/var/lib/docker/volumes/functional-449836/_data",
	                "Destination": "/var",
	                "Driver": "local",
	                "Mode": "z",
	                "RW": true,
	                "Propagation": ""
	            }
	        ],
	        "Config": {
	            "Hostname": "functional-449836",
	            "Domainname": "",
	            "User": "",
	            "AttachStdin": false,
	            "AttachStdout": false,
	            "AttachStderr": false,
	            "ExposedPorts": {
	                "22/tcp": {},
	                "2376/tcp": {},
	                "32443/tcp": {},
	                "5000/tcp": {},
	                "8441/tcp": {}
	            },
	            "Tty": true,
	            "OpenStdin": false,
	            "StdinOnce": false,
	            "Env": [
	                "container=docker",
	                "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin"
	            ],
	            "Cmd": null,
	            "Image": "gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b",
	            "Volumes": null,
	            "WorkingDir": "/",
	            "Entrypoint": [
	                "/usr/local/bin/entrypoint",
	                "/sbin/init"
	            ],
	            "OnBuild": null,
	            "Labels": {
	                "created_by.minikube.sigs.k8s.io": "true",
	                "mode.minikube.sigs.k8s.io": "functional-449836",
	                "name.minikube.sigs.k8s.io": "functional-449836",
	                "role.minikube.sigs.k8s.io": ""
	            },
	            "StopSignal": "SIGRTMIN+3"
	        },
	        "NetworkSettings": {
	            "Bridge": "",
	            "SandboxID": "410fbaab809a56b556195f7ed6eeff8dcd31e9020fb1dbfacf74828b79df3d88",
	            "SandboxKey": "/var/run/docker/netns/410fbaab809a",
	            "Ports": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "32788"
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "32789"
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "32792"
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "32790"
	                    }
	                ],
	                "8441/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "32791"
	                    }
	                ]
	            },
	            "HairpinMode": false,
	            "LinkLocalIPv6Address": "",
	            "LinkLocalIPv6PrefixLen": 0,
	            "SecondaryIPAddresses": null,
	            "SecondaryIPv6Addresses": null,
	            "EndpointID": "",
	            "Gateway": "",
	            "GlobalIPv6Address": "",
	            "GlobalIPv6PrefixLen": 0,
	            "IPAddress": "",
	            "IPPrefixLen": 0,
	            "IPv6Gateway": "",
	            "MacAddress": "",
	            "Networks": {
	                "functional-449836": {
	                    "IPAMConfig": {
	                        "IPv4Address": "192.168.49.2"
	                    },
	                    "Links": null,
	                    "Aliases": null,
	                    "MacAddress": "82:18:11:ce:46:48",
	                    "DriverOpts": null,
	                    "GwPriority": 0,
	                    "NetworkID": "3cb7d67d4fa267ddd6b37211325b224fb3fb811be8ff57bda18e19f6929ec9c8",
	                    "EndpointID": "20c8c1a67e53d7615656777f73986a40cb1c6affb22c4db185c479ac85cbdb14",
	                    "Gateway": "192.168.49.1",
	                    "IPAddress": "192.168.49.2",
	                    "IPPrefixLen": 24,
	                    "IPv6Gateway": "",
	                    "GlobalIPv6Address": "",
	                    "GlobalIPv6PrefixLen": 0,
	                    "DNSNames": [
	                        "functional-449836",
	                        "6870f21b62bb"
	                    ]
	                }
	            }
	        }
	    }
	]

                                                
                                                
-- /stdout --
helpers_test.go:247: (dbg) Run:  out/minikube-linux-arm64 status --format={{.Host}} -p functional-449836 -n functional-449836
helpers_test.go:247: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.Host}} -p functional-449836 -n functional-449836: exit status 2 (333.195342ms)

                                                
                                                
-- stdout --
	Running

                                                
                                                
-- /stdout --
helpers_test.go:247: status error: exit status 2 (may be ok)
helpers_test.go:252: <<< TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim FAILED: start of post-mortem logs <<<
helpers_test.go:253: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim]: minikube logs <======
helpers_test.go:255: (dbg) Run:  out/minikube-linux-arm64 -p functional-449836 logs -n 25
helpers_test.go:260: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim logs: 
-- stdout --
	
	==> Audit <==
	┌────────────────┬─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┬───────────────────┬─────────┬─────────┬─────────────────────┬─────────────────────┐
	│    COMMAND     │                                                                              ARGS                                                                               │      PROFILE      │  USER   │ VERSION │     START TIME      │      END TIME       │
	├────────────────┼─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┼───────────────────┼─────────┼─────────┼─────────────────────┼─────────────────────┤
	│ image          │ functional-449836 image load --daemon kicbase/echo-server:functional-449836 --alsologtostderr                                                                   │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:27 UTC │ 02 Dec 25 19:28 UTC │
	│ image          │ functional-449836 image ls                                                                                                                                      │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:28 UTC │ 02 Dec 25 19:28 UTC │
	│ image          │ functional-449836 image save kicbase/echo-server:functional-449836 /home/jenkins/workspace/Docker_Linux_containerd_arm64/echo-server-save.tar --alsologtostderr │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:28 UTC │ 02 Dec 25 19:28 UTC │
	│ image          │ functional-449836 image rm kicbase/echo-server:functional-449836 --alsologtostderr                                                                              │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:28 UTC │ 02 Dec 25 19:28 UTC │
	│ image          │ functional-449836 image ls                                                                                                                                      │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:28 UTC │ 02 Dec 25 19:28 UTC │
	│ image          │ functional-449836 image load /home/jenkins/workspace/Docker_Linux_containerd_arm64/echo-server-save.tar --alsologtostderr                                       │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:28 UTC │ 02 Dec 25 19:28 UTC │
	│ image          │ functional-449836 image ls                                                                                                                                      │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:28 UTC │ 02 Dec 25 19:28 UTC │
	│ image          │ functional-449836 image save --daemon kicbase/echo-server:functional-449836 --alsologtostderr                                                                   │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:28 UTC │ 02 Dec 25 19:28 UTC │
	│ ssh            │ functional-449836 ssh sudo cat /etc/test/nested/copy/4435/hosts                                                                                                 │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:28 UTC │ 02 Dec 25 19:28 UTC │
	│ ssh            │ functional-449836 ssh sudo cat /etc/ssl/certs/4435.pem                                                                                                          │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:28 UTC │ 02 Dec 25 19:28 UTC │
	│ ssh            │ functional-449836 ssh sudo cat /usr/share/ca-certificates/4435.pem                                                                                              │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:28 UTC │ 02 Dec 25 19:28 UTC │
	│ ssh            │ functional-449836 ssh sudo cat /etc/ssl/certs/51391683.0                                                                                                        │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:28 UTC │ 02 Dec 25 19:28 UTC │
	│ ssh            │ functional-449836 ssh sudo cat /etc/ssl/certs/44352.pem                                                                                                         │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:28 UTC │ 02 Dec 25 19:28 UTC │
	│ ssh            │ functional-449836 ssh sudo cat /usr/share/ca-certificates/44352.pem                                                                                             │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:28 UTC │ 02 Dec 25 19:28 UTC │
	│ ssh            │ functional-449836 ssh sudo cat /etc/ssl/certs/3ec20f2e.0                                                                                                        │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:28 UTC │ 02 Dec 25 19:28 UTC │
	│ image          │ functional-449836 image ls --format short --alsologtostderr                                                                                                     │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:28 UTC │ 02 Dec 25 19:28 UTC │
	│ image          │ functional-449836 image ls --format yaml --alsologtostderr                                                                                                      │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:28 UTC │ 02 Dec 25 19:28 UTC │
	│ ssh            │ functional-449836 ssh pgrep buildkitd                                                                                                                           │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:28 UTC │                     │
	│ image          │ functional-449836 image build -t localhost/my-image:functional-449836 testdata/build --alsologtostderr                                                          │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:28 UTC │ 02 Dec 25 19:28 UTC │
	│ image          │ functional-449836 image ls                                                                                                                                      │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:28 UTC │ 02 Dec 25 19:28 UTC │
	│ image          │ functional-449836 image ls --format json --alsologtostderr                                                                                                      │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:28 UTC │ 02 Dec 25 19:28 UTC │
	│ image          │ functional-449836 image ls --format table --alsologtostderr                                                                                                     │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:28 UTC │ 02 Dec 25 19:28 UTC │
	│ update-context │ functional-449836 update-context --alsologtostderr -v=2                                                                                                         │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:28 UTC │ 02 Dec 25 19:28 UTC │
	│ update-context │ functional-449836 update-context --alsologtostderr -v=2                                                                                                         │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:28 UTC │ 02 Dec 25 19:28 UTC │
	│ update-context │ functional-449836 update-context --alsologtostderr -v=2                                                                                                         │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:28 UTC │ 02 Dec 25 19:28 UTC │
	└────────────────┴─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┴───────────────────┴─────────┴─────────┴─────────────────────┴─────────────────────┘
	
	
	==> Last Start <==
	Log file created at: 2025/12/02 19:27:54
	Running on machine: ip-172-31-31-251
	Binary: Built with gc go1.25.3 for linux/arm64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I1202 19:27:54.334705   72077 out.go:360] Setting OutFile to fd 1 ...
	I1202 19:27:54.334826   72077 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1202 19:27:54.334832   72077 out.go:374] Setting ErrFile to fd 2...
	I1202 19:27:54.334838   72077 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1202 19:27:54.335121   72077 root.go:338] Updating PATH: /home/jenkins/minikube-integration/22021-2487/.minikube/bin
	I1202 19:27:54.335475   72077 out.go:368] Setting JSON to false
	I1202 19:27:54.336281   72077 start.go:133] hostinfo: {"hostname":"ip-172-31-31-251","uptime":4211,"bootTime":1764699464,"procs":157,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1084-aws","kernelArch":"aarch64","virtualizationSystem":"","virtualizationRole":"","hostId":"982e3628-3742-4b3e-bb63-ac1b07660ec7"}
	I1202 19:27:54.336407   72077 start.go:143] virtualization:  
	I1202 19:27:54.339578   72077 out.go:179] * [functional-449836] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	I1202 19:27:54.342634   72077 out.go:179]   - MINIKUBE_LOCATION=22021
	I1202 19:27:54.342710   72077 notify.go:221] Checking for updates...
	I1202 19:27:54.349520   72077 out.go:179]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I1202 19:27:54.352529   72077 out.go:179]   - KUBECONFIG=/home/jenkins/minikube-integration/22021-2487/kubeconfig
	I1202 19:27:54.355602   72077 out.go:179]   - MINIKUBE_HOME=/home/jenkins/minikube-integration/22021-2487/.minikube
	I1202 19:27:54.358596   72077 out.go:179]   - MINIKUBE_BIN=out/minikube-linux-arm64
	I1202 19:27:54.361481   72077 out.go:179]   - MINIKUBE_FORCE_SYSTEMD=
	I1202 19:27:54.364941   72077 config.go:182] Loaded profile config "functional-449836": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-beta.0
	I1202 19:27:54.365634   72077 driver.go:422] Setting default libvirt URI to qemu:///system
	I1202 19:27:54.401377   72077 docker.go:124] docker version: linux-28.1.1:Docker Engine - Community
	I1202 19:27:54.401513   72077 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1202 19:27:54.491357   72077 info.go:266] docker info: {ID:EOU5:DNGX:XN6V:L2FZ:UXRM:5TWK:EVUR:KC2F:GT7Z:Y4O4:GB77:5PD3 Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:52 SystemTime:2025-12-02 19:27:54.481540534 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214839296 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-31-251 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1202 19:27:54.491466   72077 docker.go:319] overlay module found
	I1202 19:27:54.494502   72077 out.go:179] * Using the docker driver based on existing profile
	I1202 19:27:54.497312   72077 start.go:309] selected driver: docker
	I1202 19:27:54.497331   72077 start.go:927] validating driver "docker" against &{Name:functional-449836 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-449836 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1202 19:27:54.497422   72077 start.go:938] status for docker: {Installed:true Healthy:true Running:false NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I1202 19:27:54.497523   72077 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1202 19:27:54.562952   72077 info.go:266] docker info: {ID:EOU5:DNGX:XN6V:L2FZ:UXRM:5TWK:EVUR:KC2F:GT7Z:Y4O4:GB77:5PD3 Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:52 SystemTime:2025-12-02 19:27:54.553803986 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214839296 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-31-251 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1202 19:27:54.563435   72077 cni.go:84] Creating CNI manager for ""
	I1202 19:27:54.563498   72077 cni.go:143] "docker" driver + "containerd" runtime found, recommending kindnet
	I1202 19:27:54.563539   72077 start.go:353] cluster config:
	{Name:functional-449836 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-449836 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1202 19:27:54.568468   72077 out.go:179] * dry-run validation complete!
	
	
	==> container status <==
	CONTAINER           IMAGE               CREATED             STATE               NAME                ATTEMPT             POD ID              POD                 NAMESPACE
	
	
	==> containerd <==
	Dec 02 19:27:59 functional-449836 containerd[10263]: time="2025-12-02T19:27:59.225160381Z" level=info msg="ImageCreate event name:\"sha256:ce2d2cda2d858fdaea84129deb86d18e5dbf1c548f230b79fdca74cc91729d17\"  labels:{key:\"io.cri-containerd.image\"  value:\"managed\"}"
	Dec 02 19:27:59 functional-449836 containerd[10263]: time="2025-12-02T19:27:59.225503131Z" level=info msg="ImageUpdate event name:\"docker.io/kicbase/echo-server:functional-449836\"  labels:{key:\"io.cri-containerd.image\"  value:\"managed\"}"
	Dec 02 19:28:00 functional-449836 containerd[10263]: time="2025-12-02T19:28:00.561233953Z" level=info msg="RemoveImage \"kicbase/echo-server:functional-449836\""
	Dec 02 19:28:00 functional-449836 containerd[10263]: time="2025-12-02T19:28:00.563916713Z" level=info msg="ImageDelete event name:\"docker.io/kicbase/echo-server:functional-449836\""
	Dec 02 19:28:00 functional-449836 containerd[10263]: time="2025-12-02T19:28:00.567272763Z" level=info msg="ImageDelete event name:\"sha256:ce2d2cda2d858fdaea84129deb86d18e5dbf1c548f230b79fdca74cc91729d17\""
	Dec 02 19:28:00 functional-449836 containerd[10263]: time="2025-12-02T19:28:00.576061638Z" level=info msg="RemoveImage \"kicbase/echo-server:functional-449836\" returns successfully"
	Dec 02 19:28:00 functional-449836 containerd[10263]: time="2025-12-02T19:28:00.807006076Z" level=info msg="No images store for sha256:5e6593c39feac926ea1efdfa508752787d84fd519c4ecffe2207444026bb4259"
	Dec 02 19:28:00 functional-449836 containerd[10263]: time="2025-12-02T19:28:00.809183414Z" level=info msg="ImageCreate event name:\"docker.io/kicbase/echo-server:functional-449836\""
	Dec 02 19:28:00 functional-449836 containerd[10263]: time="2025-12-02T19:28:00.816580767Z" level=info msg="ImageCreate event name:\"sha256:ce2d2cda2d858fdaea84129deb86d18e5dbf1c548f230b79fdca74cc91729d17\"  labels:{key:\"io.cri-containerd.image\"  value:\"managed\"}"
	Dec 02 19:28:00 functional-449836 containerd[10263]: time="2025-12-02T19:28:00.817441027Z" level=info msg="ImageUpdate event name:\"docker.io/kicbase/echo-server:functional-449836\"  labels:{key:\"io.cri-containerd.image\"  value:\"managed\"}"
	Dec 02 19:28:01 functional-449836 containerd[10263]: time="2025-12-02T19:28:01.620666230Z" level=info msg="RemoveImage \"kicbase/echo-server:functional-449836\""
	Dec 02 19:28:01 functional-449836 containerd[10263]: time="2025-12-02T19:28:01.623099459Z" level=info msg="ImageDelete event name:\"docker.io/kicbase/echo-server:functional-449836\""
	Dec 02 19:28:01 functional-449836 containerd[10263]: time="2025-12-02T19:28:01.625242661Z" level=info msg="ImageDelete event name:\"sha256:ce2d2cda2d858fdaea84129deb86d18e5dbf1c548f230b79fdca74cc91729d17\""
	Dec 02 19:28:01 functional-449836 containerd[10263]: time="2025-12-02T19:28:01.634624324Z" level=info msg="RemoveImage \"kicbase/echo-server:functional-449836\" returns successfully"
	Dec 02 19:28:02 functional-449836 containerd[10263]: time="2025-12-02T19:28:02.309920708Z" level=info msg="No images store for sha256:2eab97d6eb1867e5d53bb1d23591fae12b766103f26faf3270cdffd9f5a38c70"
	Dec 02 19:28:02 functional-449836 containerd[10263]: time="2025-12-02T19:28:02.312181572Z" level=info msg="ImageCreate event name:\"docker.io/kicbase/echo-server:functional-449836\""
	Dec 02 19:28:02 functional-449836 containerd[10263]: time="2025-12-02T19:28:02.321498835Z" level=info msg="ImageCreate event name:\"sha256:ce2d2cda2d858fdaea84129deb86d18e5dbf1c548f230b79fdca74cc91729d17\"  labels:{key:\"io.cri-containerd.image\"  value:\"managed\"}"
	Dec 02 19:28:02 functional-449836 containerd[10263]: time="2025-12-02T19:28:02.322190965Z" level=info msg="ImageUpdate event name:\"docker.io/kicbase/echo-server:functional-449836\"  labels:{key:\"io.cri-containerd.image\"  value:\"managed\"}"
	Dec 02 19:28:10 functional-449836 containerd[10263]: time="2025-12-02T19:28:10.362439141Z" level=info msg="connecting to shim hqi4ugrsayqj6dtb8c7764g2n" address="unix:///run/containerd/s/0fee019a1c351296b1e5edfdfd37d03b31121c5253b6db9894bb373432ca9723" namespace=k8s.io protocol=ttrpc version=3
	Dec 02 19:28:10 functional-449836 containerd[10263]: time="2025-12-02T19:28:10.455090096Z" level=info msg="shim disconnected" id=hqi4ugrsayqj6dtb8c7764g2n namespace=k8s.io
	Dec 02 19:28:10 functional-449836 containerd[10263]: time="2025-12-02T19:28:10.455131281Z" level=info msg="cleaning up after shim disconnected" id=hqi4ugrsayqj6dtb8c7764g2n namespace=k8s.io
	Dec 02 19:28:10 functional-449836 containerd[10263]: time="2025-12-02T19:28:10.455165623Z" level=info msg="cleaning up dead shim" id=hqi4ugrsayqj6dtb8c7764g2n namespace=k8s.io
	Dec 02 19:28:10 functional-449836 containerd[10263]: time="2025-12-02T19:28:10.720842484Z" level=info msg="ImageCreate event name:\"localhost/my-image:functional-449836\""
	Dec 02 19:28:10 functional-449836 containerd[10263]: time="2025-12-02T19:28:10.729200056Z" level=info msg="ImageCreate event name:\"sha256:a30433d368d66ec77b37b7bf10ff31a359d75890cf1752556ca5a3a2233accc8\"  labels:{key:\"io.cri-containerd.image\"  value:\"managed\"}"
	Dec 02 19:28:10 functional-449836 containerd[10263]: time="2025-12-02T19:28:10.729650811Z" level=info msg="ImageUpdate event name:\"localhost/my-image:functional-449836\"  labels:{key:\"io.cri-containerd.image\"  value:\"managed\"}"
	
	
	==> describe nodes <==
	command /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" failed with error: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:30:05.648642   25725 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:30:05.649430   25725 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:30:05.651009   25725 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:30:05.651431   25725 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:30:05.652890   25725 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	
	==> dmesg <==
	[Dec 2 18:17] ACPI: SRAT not present
	[  +0.000000] ACPI: SRAT not present
	[  +0.000000] SPI driver altr_a10sr has no spi_device_id for altr,a10sr
	[  +0.015127] device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
	[  +0.494583] systemd[1]: Configuration file /run/systemd/system/netplan-ovs-cleanup.service is marked world-inaccessible. This has no effect as configuration data is accessible via APIs without restrictions. Proceeding anyway.
	[  +0.035754] systemd[1]: /lib/systemd/system/snapd.service:23: Unknown key name 'RestartMode' in section 'Service', ignoring.
	[  +0.870945] ena 0000:00:05.0: LLQ is not supported Fallback to host mode policy.
	[  +6.299680] kauditd_printk_skb: 36 callbacks suppressed
	
	
	==> kernel <==
	 19:30:05 up  1:12,  0 user,  load average: 0.28, 0.35, 0.37
	Linux functional-449836 5.15.0-1084-aws #91~20.04.1-Ubuntu SMP Fri May 2 07:00:04 UTC 2025 aarch64 GNU/Linux
	PRETTY_NAME="Debian GNU/Linux 12 (bookworm)"
	
	
	==> kubelet <==
	Dec 02 19:30:02 functional-449836 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 02 19:30:02 functional-449836 kubelet[25593]: E1202 19:30:02.715411   25593 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 02 19:30:02 functional-449836 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 02 19:30:02 functional-449836 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 02 19:30:03 functional-449836 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 651.
	Dec 02 19:30:03 functional-449836 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 02 19:30:03 functional-449836 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 02 19:30:03 functional-449836 kubelet[25598]: E1202 19:30:03.468941   25598 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 02 19:30:03 functional-449836 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 02 19:30:03 functional-449836 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 02 19:30:04 functional-449836 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 652.
	Dec 02 19:30:04 functional-449836 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 02 19:30:04 functional-449836 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 02 19:30:04 functional-449836 kubelet[25603]: E1202 19:30:04.216939   25603 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 02 19:30:04 functional-449836 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 02 19:30:04 functional-449836 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 02 19:30:04 functional-449836 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 653.
	Dec 02 19:30:04 functional-449836 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 02 19:30:04 functional-449836 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 02 19:30:04 functional-449836 kubelet[25630]: E1202 19:30:04.993202   25630 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 02 19:30:04 functional-449836 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 02 19:30:04 functional-449836 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 02 19:30:05 functional-449836 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 654.
	Dec 02 19:30:05 functional-449836 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 02 19:30:05 functional-449836 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	

                                                
                                                
-- /stdout --
helpers_test.go:262: (dbg) Run:  out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-449836 -n functional-449836
helpers_test.go:262: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-449836 -n functional-449836: exit status 2 (328.986001ms)

                                                
                                                
-- stdout --
	Stopped

                                                
                                                
-- /stdout --
helpers_test.go:262: status error: exit status 2 (may be ok)
helpers_test.go:264: "functional-449836" apiserver is not running, skipping kubectl commands (state="Stopped")
--- FAIL: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PersistentVolumeClaim (241.69s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/NodeLabels (1.42s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/NodeLabels
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/NodeLabels

=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/NodeLabels
functional_test.go:234: (dbg) Run:  kubectl --context functional-449836 get nodes --output=go-template "--template='{{range $k, $v := (index .items 0).metadata.labels}}{{$k}} {{end}}'"
functional_test.go:234: (dbg) Non-zero exit: kubectl --context functional-449836 get nodes --output=go-template "--template='{{range $k, $v := (index .items 0).metadata.labels}}{{$k}} {{end}}'": exit status 1 (66.653317ms)

                                                
                                                
-- stdout --
	'Error executing template: template: output:1:20: executing "output" at <index .items 0>: error calling index: reflect: slice index out of range. Printing more information for debugging the template:
		template was:
			'{{range $k, $v := (index .items 0).metadata.labels}}{{$k}} {{end}}'
		raw data was:
			{"apiVersion":"v1","items":[],"kind":"List","metadata":{"resourceVersion":""}}
		object given to template engine was:
			map[apiVersion:v1 items:[] kind:List metadata:map[resourceVersion:]]
	

                                                
                                                
-- /stdout --
** stderr ** 
	The connection to the server 192.168.49.2:8441 was refused - did you specify the right host or port?
	error executing template "'{{range $k, $v := (index .items 0).metadata.labels}}{{$k}} {{end}}'": template: output:1:20: executing "output" at <index .items 0>: error calling index: reflect: slice index out of range

                                                
                                                
** /stderr **
functional_test.go:236: failed to 'kubectl get nodes' with args "kubectl --context functional-449836 get nodes --output=go-template \"--template='{{range $k, $v := (index .items 0).metadata.labels}}{{$k}} {{end}}'\"": exit status 1
functional_test.go:242: expected to have label "minikube.k8s.io/commit" in node labels but got : 
-- stdout --
	'Error executing template: template: output:1:20: executing "output" at <index .items 0>: error calling index: reflect: slice index out of range. Printing more information for debugging the template:
		template was:
			'{{range $k, $v := (index .items 0).metadata.labels}}{{$k}} {{end}}'
		raw data was:
			{"apiVersion":"v1","items":[],"kind":"List","metadata":{"resourceVersion":""}}
		object given to template engine was:
			map[apiVersion:v1 items:[] kind:List metadata:map[resourceVersion:]]
	

                                                
                                                
-- /stdout --
** stderr ** 
	The connection to the server 192.168.49.2:8441 was refused - did you specify the right host or port?
	error executing template "'{{range $k, $v := (index .items 0).metadata.labels}}{{$k}} {{end}}'": template: output:1:20: executing "output" at <index .items 0>: error calling index: reflect: slice index out of range

                                                
                                                
** /stderr **
functional_test.go:242: expected to have label "minikube.k8s.io/version" in node labels but got : 
-- stdout --
	'Error executing template: template: output:1:20: executing "output" at <index .items 0>: error calling index: reflect: slice index out of range. Printing more information for debugging the template:
		template was:
			'{{range $k, $v := (index .items 0).metadata.labels}}{{$k}} {{end}}'
		raw data was:
			{"apiVersion":"v1","items":[],"kind":"List","metadata":{"resourceVersion":""}}
		object given to template engine was:
			map[apiVersion:v1 items:[] kind:List metadata:map[resourceVersion:]]
	

                                                
                                                
-- /stdout --
** stderr ** 
	The connection to the server 192.168.49.2:8441 was refused - did you specify the right host or port?
	error executing template "'{{range $k, $v := (index .items 0).metadata.labels}}{{$k}} {{end}}'": template: output:1:20: executing "output" at <index .items 0>: error calling index: reflect: slice index out of range

                                                
                                                
** /stderr **
functional_test.go:242: expected to have label "minikube.k8s.io/updated_at" in node labels but got : 
-- stdout --
	'Error executing template: template: output:1:20: executing "output" at <index .items 0>: error calling index: reflect: slice index out of range. Printing more information for debugging the template:
		template was:
			'{{range $k, $v := (index .items 0).metadata.labels}}{{$k}} {{end}}'
		raw data was:
			{"apiVersion":"v1","items":[],"kind":"List","metadata":{"resourceVersion":""}}
		object given to template engine was:
			map[apiVersion:v1 items:[] kind:List metadata:map[resourceVersion:]]
	

                                                
                                                
-- /stdout --
** stderr ** 
	The connection to the server 192.168.49.2:8441 was refused - did you specify the right host or port?
	error executing template "'{{range $k, $v := (index .items 0).metadata.labels}}{{$k}} {{end}}'": template: output:1:20: executing "output" at <index .items 0>: error calling index: reflect: slice index out of range

                                                
                                                
** /stderr **
functional_test.go:242: expected to have label "minikube.k8s.io/name" in node labels but got : 
-- stdout --
	'Error executing template: template: output:1:20: executing "output" at <index .items 0>: error calling index: reflect: slice index out of range. Printing more information for debugging the template:
		template was:
			'{{range $k, $v := (index .items 0).metadata.labels}}{{$k}} {{end}}'
		raw data was:
			{"apiVersion":"v1","items":[],"kind":"List","metadata":{"resourceVersion":""}}
		object given to template engine was:
			map[apiVersion:v1 items:[] kind:List metadata:map[resourceVersion:]]
	

                                                
                                                
-- /stdout --
** stderr ** 
	The connection to the server 192.168.49.2:8441 was refused - did you specify the right host or port?
	error executing template "'{{range $k, $v := (index .items 0).metadata.labels}}{{$k}} {{end}}'": template: output:1:20: executing "output" at <index .items 0>: error calling index: reflect: slice index out of range

                                                
                                                
** /stderr **
functional_test.go:242: expected to have label "minikube.k8s.io/primary" in node labels but got : 
-- stdout --
	'Error executing template: template: output:1:20: executing "output" at <index .items 0>: error calling index: reflect: slice index out of range. Printing more information for debugging the template:
		template was:
			'{{range $k, $v := (index .items 0).metadata.labels}}{{$k}} {{end}}'
		raw data was:
			{"apiVersion":"v1","items":[],"kind":"List","metadata":{"resourceVersion":""}}
		object given to template engine was:
			map[apiVersion:v1 items:[] kind:List metadata:map[resourceVersion:]]
	

                                                
                                                
-- /stdout --
** stderr ** 
	The connection to the server 192.168.49.2:8441 was refused - did you specify the right host or port?
	error executing template "'{{range $k, $v := (index .items 0).metadata.labels}}{{$k}} {{end}}'": template: output:1:20: executing "output" at <index .items 0>: error calling index: reflect: slice index out of range

                                                
                                                
** /stderr **
helpers_test.go:222: -----------------------post-mortem--------------------------------
helpers_test.go:223: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/NodeLabels]: network settings <======
helpers_test.go:230: HOST ENV snapshots: PROXY env: HTTP_PROXY="<empty>" HTTPS_PROXY="<empty>" NO_PROXY="<empty>"
helpers_test.go:238: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/NodeLabels]: docker inspect <======
helpers_test.go:239: (dbg) Run:  docker inspect functional-449836
helpers_test.go:243: (dbg) docker inspect functional-449836:

-- stdout --
	[
	    {
	        "Id": "6870f21b62bb6903aca3129f1ce4723cf3f2ffad99a50b164f8e2dc04b50e75d",
	        "Created": "2025-12-02T18:58:53.515075222Z",
	        "Path": "/usr/local/bin/entrypoint",
	        "Args": [
	            "/sbin/init"
	        ],
	        "State": {
	            "Status": "running",
	            "Running": true,
	            "Paused": false,
	            "Restarting": false,
	            "OOMKilled": false,
	            "Dead": false,
	            "Pid": 43093,
	            "ExitCode": 0,
	            "Error": "",
	            "StartedAt": "2025-12-02T18:58:53.587847975Z",
	            "FinishedAt": "0001-01-01T00:00:00Z"
	        },
	        "Image": "sha256:ac919894123858c63a6b115b7a0677e38aafc32ba4f00c3ebbd7c61e958451be",
	        "ResolvConfPath": "/var/lib/docker/containers/6870f21b62bb6903aca3129f1ce4723cf3f2ffad99a50b164f8e2dc04b50e75d/resolv.conf",
	        "HostnamePath": "/var/lib/docker/containers/6870f21b62bb6903aca3129f1ce4723cf3f2ffad99a50b164f8e2dc04b50e75d/hostname",
	        "HostsPath": "/var/lib/docker/containers/6870f21b62bb6903aca3129f1ce4723cf3f2ffad99a50b164f8e2dc04b50e75d/hosts",
	        "LogPath": "/var/lib/docker/containers/6870f21b62bb6903aca3129f1ce4723cf3f2ffad99a50b164f8e2dc04b50e75d/6870f21b62bb6903aca3129f1ce4723cf3f2ffad99a50b164f8e2dc04b50e75d-json.log",
	        "Name": "/functional-449836",
	        "RestartCount": 0,
	        "Driver": "overlay2",
	        "Platform": "linux",
	        "MountLabel": "",
	        "ProcessLabel": "",
	        "AppArmorProfile": "unconfined",
	        "ExecIDs": null,
	        "HostConfig": {
	            "Binds": [
	                "/lib/modules:/lib/modules:ro",
	                "functional-449836:/var"
	            ],
	            "ContainerIDFile": "",
	            "LogConfig": {
	                "Type": "json-file",
	                "Config": {}
	            },
	            "NetworkMode": "functional-449836",
	            "PortBindings": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "8441/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ]
	            },
	            "RestartPolicy": {
	                "Name": "no",
	                "MaximumRetryCount": 0
	            },
	            "AutoRemove": false,
	            "VolumeDriver": "",
	            "VolumesFrom": null,
	            "ConsoleSize": [
	                0,
	                0
	            ],
	            "CapAdd": null,
	            "CapDrop": null,
	            "CgroupnsMode": "host",
	            "Dns": [],
	            "DnsOptions": [],
	            "DnsSearch": [],
	            "ExtraHosts": null,
	            "GroupAdd": null,
	            "IpcMode": "private",
	            "Cgroup": "",
	            "Links": null,
	            "OomScoreAdj": 0,
	            "PidMode": "",
	            "Privileged": true,
	            "PublishAllPorts": false,
	            "ReadonlyRootfs": false,
	            "SecurityOpt": [
	                "seccomp=unconfined",
	                "apparmor=unconfined",
	                "label=disable"
	            ],
	            "Tmpfs": {
	                "/run": "",
	                "/tmp": ""
	            },
	            "UTSMode": "",
	            "UsernsMode": "",
	            "ShmSize": 67108864,
	            "Runtime": "runc",
	            "Isolation": "",
	            "CpuShares": 0,
	            "Memory": 4294967296,
	            "NanoCpus": 2000000000,
	            "CgroupParent": "",
	            "BlkioWeight": 0,
	            "BlkioWeightDevice": [],
	            "BlkioDeviceReadBps": [],
	            "BlkioDeviceWriteBps": [],
	            "BlkioDeviceReadIOps": [],
	            "BlkioDeviceWriteIOps": [],
	            "CpuPeriod": 0,
	            "CpuQuota": 0,
	            "CpuRealtimePeriod": 0,
	            "CpuRealtimeRuntime": 0,
	            "CpusetCpus": "",
	            "CpusetMems": "",
	            "Devices": [],
	            "DeviceCgroupRules": null,
	            "DeviceRequests": null,
	            "MemoryReservation": 0,
	            "MemorySwap": 8589934592,
	            "MemorySwappiness": null,
	            "OomKillDisable": false,
	            "PidsLimit": null,
	            "Ulimits": [],
	            "CpuCount": 0,
	            "CpuPercent": 0,
	            "IOMaximumIOps": 0,
	            "IOMaximumBandwidth": 0,
	            "MaskedPaths": null,
	            "ReadonlyPaths": null
	        },
	        "GraphDriver": {
	            "Data": {
	                "ID": "6870f21b62bb6903aca3129f1ce4723cf3f2ffad99a50b164f8e2dc04b50e75d",
	                "LowerDir": "/var/lib/docker/overlay2/41ea5a18fbf8709879a7fc4066a6a4a1474aa86e898ee1ccabe5669a1871131d-init/diff:/var/lib/docker/overlay2/a59c61675ee48e07a7f4a8725bd393449453344ad8907963779ea1c0059d936c/diff",
	                "MergedDir": "/var/lib/docker/overlay2/41ea5a18fbf8709879a7fc4066a6a4a1474aa86e898ee1ccabe5669a1871131d/merged",
	                "UpperDir": "/var/lib/docker/overlay2/41ea5a18fbf8709879a7fc4066a6a4a1474aa86e898ee1ccabe5669a1871131d/diff",
	                "WorkDir": "/var/lib/docker/overlay2/41ea5a18fbf8709879a7fc4066a6a4a1474aa86e898ee1ccabe5669a1871131d/work"
	            },
	            "Name": "overlay2"
	        },
	        "Mounts": [
	            {
	                "Type": "bind",
	                "Source": "/lib/modules",
	                "Destination": "/lib/modules",
	                "Mode": "ro",
	                "RW": false,
	                "Propagation": "rprivate"
	            },
	            {
	                "Type": "volume",
	                "Name": "functional-449836",
	                "Source": "/var/lib/docker/volumes/functional-449836/_data",
	                "Destination": "/var",
	                "Driver": "local",
	                "Mode": "z",
	                "RW": true,
	                "Propagation": ""
	            }
	        ],
	        "Config": {
	            "Hostname": "functional-449836",
	            "Domainname": "",
	            "User": "",
	            "AttachStdin": false,
	            "AttachStdout": false,
	            "AttachStderr": false,
	            "ExposedPorts": {
	                "22/tcp": {},
	                "2376/tcp": {},
	                "32443/tcp": {},
	                "5000/tcp": {},
	                "8441/tcp": {}
	            },
	            "Tty": true,
	            "OpenStdin": false,
	            "StdinOnce": false,
	            "Env": [
	                "container=docker",
	                "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin"
	            ],
	            "Cmd": null,
	            "Image": "gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b",
	            "Volumes": null,
	            "WorkingDir": "/",
	            "Entrypoint": [
	                "/usr/local/bin/entrypoint",
	                "/sbin/init"
	            ],
	            "OnBuild": null,
	            "Labels": {
	                "created_by.minikube.sigs.k8s.io": "true",
	                "mode.minikube.sigs.k8s.io": "functional-449836",
	                "name.minikube.sigs.k8s.io": "functional-449836",
	                "role.minikube.sigs.k8s.io": ""
	            },
	            "StopSignal": "SIGRTMIN+3"
	        },
	        "NetworkSettings": {
	            "Bridge": "",
	            "SandboxID": "410fbaab809a56b556195f7ed6eeff8dcd31e9020fb1dbfacf74828b79df3d88",
	            "SandboxKey": "/var/run/docker/netns/410fbaab809a",
	            "Ports": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "32788"
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "32789"
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "32792"
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "32790"
	                    }
	                ],
	                "8441/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "32791"
	                    }
	                ]
	            },
	            "HairpinMode": false,
	            "LinkLocalIPv6Address": "",
	            "LinkLocalIPv6PrefixLen": 0,
	            "SecondaryIPAddresses": null,
	            "SecondaryIPv6Addresses": null,
	            "EndpointID": "",
	            "Gateway": "",
	            "GlobalIPv6Address": "",
	            "GlobalIPv6PrefixLen": 0,
	            "IPAddress": "",
	            "IPPrefixLen": 0,
	            "IPv6Gateway": "",
	            "MacAddress": "",
	            "Networks": {
	                "functional-449836": {
	                    "IPAMConfig": {
	                        "IPv4Address": "192.168.49.2"
	                    },
	                    "Links": null,
	                    "Aliases": null,
	                    "MacAddress": "82:18:11:ce:46:48",
	                    "DriverOpts": null,
	                    "GwPriority": 0,
	                    "NetworkID": "3cb7d67d4fa267ddd6b37211325b224fb3fb811be8ff57bda18e19f6929ec9c8",
	                    "EndpointID": "20c8c1a67e53d7615656777f73986a40cb1c6affb22c4db185c479ac85cbdb14",
	                    "Gateway": "192.168.49.1",
	                    "IPAddress": "192.168.49.2",
	                    "IPPrefixLen": 24,
	                    "IPv6Gateway": "",
	                    "GlobalIPv6Address": "",
	                    "GlobalIPv6PrefixLen": 0,
	                    "DNSNames": [
	                        "functional-449836",
	                        "6870f21b62bb"
	                    ]
	                }
	            }
	        }
	    }
	]

-- /stdout --
helpers_test.go:247: (dbg) Run:  out/minikube-linux-arm64 status --format={{.Host}} -p functional-449836 -n functional-449836
helpers_test.go:247: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.Host}} -p functional-449836 -n functional-449836: exit status 2 (328.109065ms)

-- stdout --
	Running

-- /stdout --
helpers_test.go:247: status error: exit status 2 (may be ok)
helpers_test.go:252: <<< TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/NodeLabels FAILED: start of post-mortem logs <<<
helpers_test.go:253: ======>  post-mortem[TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/NodeLabels]: minikube logs <======
helpers_test.go:255: (dbg) Run:  out/minikube-linux-arm64 -p functional-449836 logs -n 25
helpers_test.go:260: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/NodeLabels logs: 
-- stdout --
	
	==> Audit <==
	┌───────────┬─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┬───────────────────┬─────────┬─────────┬─────────────────────┬─────────────────────┐
	│  COMMAND  │                                                                              ARGS                                                                               │      PROFILE      │  USER   │ VERSION │     START TIME      │      END TIME       │
	├───────────┼─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┼───────────────────┼─────────┼─────────┼─────────────────────┼─────────────────────┤
	│ mount     │ -p functional-449836 /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-beta.0parallelMo681912831/001:/mount3 --alsologtostderr -v=1                             │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:27 UTC │                     │
	│ mount     │ -p functional-449836 /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-beta.0parallelMo681912831/001:/mount2 --alsologtostderr -v=1                             │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:27 UTC │                     │
	│ ssh       │ functional-449836 ssh findmnt -T /mount1                                                                                                                        │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:27 UTC │ 02 Dec 25 19:27 UTC │
	│ ssh       │ functional-449836 ssh findmnt -T /mount2                                                                                                                        │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:27 UTC │ 02 Dec 25 19:27 UTC │
	│ ssh       │ functional-449836 ssh findmnt -T /mount3                                                                                                                        │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:27 UTC │ 02 Dec 25 19:27 UTC │
	│ mount     │ -p functional-449836 --kill=true                                                                                                                                │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:27 UTC │                     │
	│ start     │ -p functional-449836 --dry-run --memory 250MB --alsologtostderr --driver=docker  --container-runtime=containerd --kubernetes-version=v1.35.0-beta.0             │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:27 UTC │                     │
	│ start     │ -p functional-449836 --dry-run --memory 250MB --alsologtostderr --driver=docker  --container-runtime=containerd --kubernetes-version=v1.35.0-beta.0             │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:27 UTC │                     │
	│ start     │ -p functional-449836 --dry-run --alsologtostderr -v=1 --driver=docker  --container-runtime=containerd --kubernetes-version=v1.35.0-beta.0                       │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:27 UTC │                     │
	│ dashboard │ --url --port 36195 -p functional-449836 --alsologtostderr -v=1                                                                                                  │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:27 UTC │                     │
	│ license   │                                                                                                                                                                 │ minikube          │ jenkins │ v1.37.0 │ 02 Dec 25 19:27 UTC │ 02 Dec 25 19:27 UTC │
	│ ssh       │ functional-449836 ssh sudo systemctl is-active docker                                                                                                           │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:27 UTC │                     │
	│ ssh       │ functional-449836 ssh sudo systemctl is-active crio                                                                                                             │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:27 UTC │                     │
	│ image     │ functional-449836 image load --daemon kicbase/echo-server:functional-449836 --alsologtostderr                                                                   │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:27 UTC │ 02 Dec 25 19:27 UTC │
	│ image     │ functional-449836 image ls                                                                                                                                      │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:27 UTC │ 02 Dec 25 19:27 UTC │
	│ image     │ functional-449836 image load --daemon kicbase/echo-server:functional-449836 --alsologtostderr                                                                   │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:27 UTC │ 02 Dec 25 19:27 UTC │
	│ image     │ functional-449836 image ls                                                                                                                                      │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:27 UTC │ 02 Dec 25 19:27 UTC │
	│ image     │ functional-449836 image load --daemon kicbase/echo-server:functional-449836 --alsologtostderr                                                                   │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:27 UTC │ 02 Dec 25 19:28 UTC │
	│ image     │ functional-449836 image ls                                                                                                                                      │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:28 UTC │ 02 Dec 25 19:28 UTC │
	│ image     │ functional-449836 image save kicbase/echo-server:functional-449836 /home/jenkins/workspace/Docker_Linux_containerd_arm64/echo-server-save.tar --alsologtostderr │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:28 UTC │ 02 Dec 25 19:28 UTC │
	│ image     │ functional-449836 image rm kicbase/echo-server:functional-449836 --alsologtostderr                                                                              │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:28 UTC │ 02 Dec 25 19:28 UTC │
	│ image     │ functional-449836 image ls                                                                                                                                      │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:28 UTC │ 02 Dec 25 19:28 UTC │
	│ image     │ functional-449836 image load /home/jenkins/workspace/Docker_Linux_containerd_arm64/echo-server-save.tar --alsologtostderr                                       │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:28 UTC │ 02 Dec 25 19:28 UTC │
	│ image     │ functional-449836 image ls                                                                                                                                      │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:28 UTC │ 02 Dec 25 19:28 UTC │
	│ image     │ functional-449836 image save --daemon kicbase/echo-server:functional-449836 --alsologtostderr                                                                   │ functional-449836 │ jenkins │ v1.37.0 │ 02 Dec 25 19:28 UTC │ 02 Dec 25 19:28 UTC │
	└───────────┴─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┴───────────────────┴─────────┴─────────┴─────────────────────┴─────────────────────┘
	
	
	==> Last Start <==
	Log file created at: 2025/12/02 19:27:54
	Running on machine: ip-172-31-31-251
	Binary: Built with gc go1.25.3 for linux/arm64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I1202 19:27:54.334705   72077 out.go:360] Setting OutFile to fd 1 ...
	I1202 19:27:54.334826   72077 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1202 19:27:54.334832   72077 out.go:374] Setting ErrFile to fd 2...
	I1202 19:27:54.334838   72077 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1202 19:27:54.335121   72077 root.go:338] Updating PATH: /home/jenkins/minikube-integration/22021-2487/.minikube/bin
	I1202 19:27:54.335475   72077 out.go:368] Setting JSON to false
	I1202 19:27:54.336281   72077 start.go:133] hostinfo: {"hostname":"ip-172-31-31-251","uptime":4211,"bootTime":1764699464,"procs":157,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1084-aws","kernelArch":"aarch64","virtualizationSystem":"","virtualizationRole":"","hostId":"982e3628-3742-4b3e-bb63-ac1b07660ec7"}
	I1202 19:27:54.336407   72077 start.go:143] virtualization:  
	I1202 19:27:54.339578   72077 out.go:179] * [functional-449836] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	I1202 19:27:54.342634   72077 out.go:179]   - MINIKUBE_LOCATION=22021
	I1202 19:27:54.342710   72077 notify.go:221] Checking for updates...
	I1202 19:27:54.349520   72077 out.go:179]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I1202 19:27:54.352529   72077 out.go:179]   - KUBECONFIG=/home/jenkins/minikube-integration/22021-2487/kubeconfig
	I1202 19:27:54.355602   72077 out.go:179]   - MINIKUBE_HOME=/home/jenkins/minikube-integration/22021-2487/.minikube
	I1202 19:27:54.358596   72077 out.go:179]   - MINIKUBE_BIN=out/minikube-linux-arm64
	I1202 19:27:54.361481   72077 out.go:179]   - MINIKUBE_FORCE_SYSTEMD=
	I1202 19:27:54.364941   72077 config.go:182] Loaded profile config "functional-449836": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-beta.0
	I1202 19:27:54.365634   72077 driver.go:422] Setting default libvirt URI to qemu:///system
	I1202 19:27:54.401377   72077 docker.go:124] docker version: linux-28.1.1:Docker Engine - Community
	I1202 19:27:54.401513   72077 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1202 19:27:54.491357   72077 info.go:266] docker info: {ID:EOU5:DNGX:XN6V:L2FZ:UXRM:5TWK:EVUR:KC2F:GT7Z:Y4O4:GB77:5PD3 Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:52 SystemTime:2025-12-02 19:27:54.481540534 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214839296 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-31-251 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1202 19:27:54.491466   72077 docker.go:319] overlay module found
	I1202 19:27:54.494502   72077 out.go:179] * Using the docker driver based on existing profile
	I1202 19:27:54.497312   72077 start.go:309] selected driver: docker
	I1202 19:27:54.497331   72077 start.go:927] validating driver "docker" against &{Name:functional-449836 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-449836 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1202 19:27:54.497422   72077 start.go:938] status for docker: {Installed:true Healthy:true Running:false NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I1202 19:27:54.497523   72077 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1202 19:27:54.562952   72077 info.go:266] docker info: {ID:EOU5:DNGX:XN6V:L2FZ:UXRM:5TWK:EVUR:KC2F:GT7Z:Y4O4:GB77:5PD3 Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:52 SystemTime:2025-12-02 19:27:54.553803986 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214839296 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-31-251 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1202 19:27:54.563435   72077 cni.go:84] Creating CNI manager for ""
	I1202 19:27:54.563498   72077 cni.go:143] "docker" driver + "containerd" runtime found, recommending kindnet
	I1202 19:27:54.563539   72077 start.go:353] cluster config:
	{Name:functional-449836 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-449836 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1202 19:27:54.568468   72077 out.go:179] * dry-run validation complete!
	
	
	==> container status <==
	CONTAINER           IMAGE               CREATED             STATE               NAME                ATTEMPT             POD ID              POD                 NAMESPACE
	
	
	==> containerd <==
	Dec 02 19:27:58 functional-449836 containerd[10263]: time="2025-12-02T19:27:58.135644927Z" level=info msg="ImageUpdate event name:\"docker.io/kicbase/echo-server:functional-449836\"  labels:{key:\"io.cri-containerd.image\"  value:\"managed\"}"
	Dec 02 19:27:58 functional-449836 containerd[10263]: time="2025-12-02T19:27:58.971078825Z" level=info msg="RemoveImage \"kicbase/echo-server:functional-449836\""
	Dec 02 19:27:58 functional-449836 containerd[10263]: time="2025-12-02T19:27:58.974414206Z" level=info msg="ImageDelete event name:\"docker.io/kicbase/echo-server:functional-449836\""
	Dec 02 19:27:58 functional-449836 containerd[10263]: time="2025-12-02T19:27:58.978187203Z" level=info msg="ImageDelete event name:\"sha256:ce2d2cda2d858fdaea84129deb86d18e5dbf1c548f230b79fdca74cc91729d17\""
	Dec 02 19:27:58 functional-449836 containerd[10263]: time="2025-12-02T19:27:58.986405586Z" level=info msg="RemoveImage \"kicbase/echo-server:functional-449836\" returns successfully"
	Dec 02 19:27:59 functional-449836 containerd[10263]: time="2025-12-02T19:27:59.215670012Z" level=info msg="No images store for sha256:5e6593c39feac926ea1efdfa508752787d84fd519c4ecffe2207444026bb4259"
	Dec 02 19:27:59 functional-449836 containerd[10263]: time="2025-12-02T19:27:59.217862260Z" level=info msg="ImageCreate event name:\"docker.io/kicbase/echo-server:functional-449836\""
	Dec 02 19:27:59 functional-449836 containerd[10263]: time="2025-12-02T19:27:59.225160381Z" level=info msg="ImageCreate event name:\"sha256:ce2d2cda2d858fdaea84129deb86d18e5dbf1c548f230b79fdca74cc91729d17\"  labels:{key:\"io.cri-containerd.image\"  value:\"managed\"}"
	Dec 02 19:27:59 functional-449836 containerd[10263]: time="2025-12-02T19:27:59.225503131Z" level=info msg="ImageUpdate event name:\"docker.io/kicbase/echo-server:functional-449836\"  labels:{key:\"io.cri-containerd.image\"  value:\"managed\"}"
	Dec 02 19:28:00 functional-449836 containerd[10263]: time="2025-12-02T19:28:00.561233953Z" level=info msg="RemoveImage \"kicbase/echo-server:functional-449836\""
	Dec 02 19:28:00 functional-449836 containerd[10263]: time="2025-12-02T19:28:00.563916713Z" level=info msg="ImageDelete event name:\"docker.io/kicbase/echo-server:functional-449836\""
	Dec 02 19:28:00 functional-449836 containerd[10263]: time="2025-12-02T19:28:00.567272763Z" level=info msg="ImageDelete event name:\"sha256:ce2d2cda2d858fdaea84129deb86d18e5dbf1c548f230b79fdca74cc91729d17\""
	Dec 02 19:28:00 functional-449836 containerd[10263]: time="2025-12-02T19:28:00.576061638Z" level=info msg="RemoveImage \"kicbase/echo-server:functional-449836\" returns successfully"
	Dec 02 19:28:00 functional-449836 containerd[10263]: time="2025-12-02T19:28:00.807006076Z" level=info msg="No images store for sha256:5e6593c39feac926ea1efdfa508752787d84fd519c4ecffe2207444026bb4259"
	Dec 02 19:28:00 functional-449836 containerd[10263]: time="2025-12-02T19:28:00.809183414Z" level=info msg="ImageCreate event name:\"docker.io/kicbase/echo-server:functional-449836\""
	Dec 02 19:28:00 functional-449836 containerd[10263]: time="2025-12-02T19:28:00.816580767Z" level=info msg="ImageCreate event name:\"sha256:ce2d2cda2d858fdaea84129deb86d18e5dbf1c548f230b79fdca74cc91729d17\"  labels:{key:\"io.cri-containerd.image\"  value:\"managed\"}"
	Dec 02 19:28:00 functional-449836 containerd[10263]: time="2025-12-02T19:28:00.817441027Z" level=info msg="ImageUpdate event name:\"docker.io/kicbase/echo-server:functional-449836\"  labels:{key:\"io.cri-containerd.image\"  value:\"managed\"}"
	Dec 02 19:28:01 functional-449836 containerd[10263]: time="2025-12-02T19:28:01.620666230Z" level=info msg="RemoveImage \"kicbase/echo-server:functional-449836\""
	Dec 02 19:28:01 functional-449836 containerd[10263]: time="2025-12-02T19:28:01.623099459Z" level=info msg="ImageDelete event name:\"docker.io/kicbase/echo-server:functional-449836\""
	Dec 02 19:28:01 functional-449836 containerd[10263]: time="2025-12-02T19:28:01.625242661Z" level=info msg="ImageDelete event name:\"sha256:ce2d2cda2d858fdaea84129deb86d18e5dbf1c548f230b79fdca74cc91729d17\""
	Dec 02 19:28:01 functional-449836 containerd[10263]: time="2025-12-02T19:28:01.634624324Z" level=info msg="RemoveImage \"kicbase/echo-server:functional-449836\" returns successfully"
	Dec 02 19:28:02 functional-449836 containerd[10263]: time="2025-12-02T19:28:02.309920708Z" level=info msg="No images store for sha256:2eab97d6eb1867e5d53bb1d23591fae12b766103f26faf3270cdffd9f5a38c70"
	Dec 02 19:28:02 functional-449836 containerd[10263]: time="2025-12-02T19:28:02.312181572Z" level=info msg="ImageCreate event name:\"docker.io/kicbase/echo-server:functional-449836\""
	Dec 02 19:28:02 functional-449836 containerd[10263]: time="2025-12-02T19:28:02.321498835Z" level=info msg="ImageCreate event name:\"sha256:ce2d2cda2d858fdaea84129deb86d18e5dbf1c548f230b79fdca74cc91729d17\"  labels:{key:\"io.cri-containerd.image\"  value:\"managed\"}"
	Dec 02 19:28:02 functional-449836 containerd[10263]: time="2025-12-02T19:28:02.322190965Z" level=info msg="ImageUpdate event name:\"docker.io/kicbase/echo-server:functional-449836\"  labels:{key:\"io.cri-containerd.image\"  value:\"managed\"}"
	
	
	==> describe nodes <==
	command /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" failed with error: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E1202 19:28:03.951291   24339 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:28:03.951859   24339 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:28:03.953333   24339 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:28:03.953662   24339 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	E1202 19:28:03.955388   24339 memcache.go:265] "Unhandled Error" err="couldn't get current server API group list: Get \"https://localhost:8441/api?timeout=32s\": dial tcp [::1]:8441: connect: connection refused"
	The connection to the server localhost:8441 was refused - did you specify the right host or port?
	
	
	==> dmesg <==
	[Dec 2 18:17] ACPI: SRAT not present
	[  +0.000000] ACPI: SRAT not present
	[  +0.000000] SPI driver altr_a10sr has no spi_device_id for altr,a10sr
	[  +0.015127] device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
	[  +0.494583] systemd[1]: Configuration file /run/systemd/system/netplan-ovs-cleanup.service is marked world-inaccessible. This has no effect as configuration data is accessible via APIs without restrictions. Proceeding anyway.
	[  +0.035754] systemd[1]: /lib/systemd/system/snapd.service:23: Unknown key name 'RestartMode' in section 'Service', ignoring.
	[  +0.870945] ena 0000:00:05.0: LLQ is not supported Fallback to host mode policy.
	[  +6.299680] kauditd_printk_skb: 36 callbacks suppressed
	
	
	==> kernel <==
	 19:28:03 up  1:10,  0 user,  load average: 0.86, 0.35, 0.38
	Linux functional-449836 5.15.0-1084-aws #91~20.04.1-Ubuntu SMP Fri May 2 07:00:04 UTC 2025 aarch64 GNU/Linux
	PRETTY_NAME="Debian GNU/Linux 12 (bookworm)"
	
	
	==> kubelet <==
	Dec 02 19:28:00 functional-449836 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 02 19:28:01 functional-449836 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 488.
	Dec 02 19:28:01 functional-449836 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 02 19:28:01 functional-449836 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 02 19:28:01 functional-449836 kubelet[24085]: E1202 19:28:01.229576   24085 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 02 19:28:01 functional-449836 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 02 19:28:01 functional-449836 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 02 19:28:01 functional-449836 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 489.
	Dec 02 19:28:01 functional-449836 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 02 19:28:01 functional-449836 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 02 19:28:01 functional-449836 kubelet[24153]: E1202 19:28:01.995298   24153 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 02 19:28:01 functional-449836 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 02 19:28:01 functional-449836 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 02 19:28:02 functional-449836 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 490.
	Dec 02 19:28:02 functional-449836 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 02 19:28:02 functional-449836 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 02 19:28:02 functional-449836 kubelet[24201]: E1202 19:28:02.726244   24201 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 02 19:28:02 functional-449836 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 02 19:28:02 functional-449836 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 02 19:28:03 functional-449836 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 491.
	Dec 02 19:28:03 functional-449836 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 02 19:28:03 functional-449836 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 02 19:28:03 functional-449836 kubelet[24255]: E1202 19:28:03.484293   24255 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 02 19:28:03 functional-449836 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 02 19:28:03 functional-449836 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	

-- /stdout --
helpers_test.go:262: (dbg) Run:  out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-449836 -n functional-449836
helpers_test.go:262: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.APIServer}} -p functional-449836 -n functional-449836: exit status 2 (324.159747ms)

-- stdout --
	Stopped

-- /stdout --
helpers_test.go:262: status error: exit status 2 (may be ok)
helpers_test.go:264: "functional-449836" apiserver is not running, skipping kubectl commands (state="Stopped")
--- FAIL: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/NodeLabels (1.42s)
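The kubelet crash-loop captured in the logs above ("kubelet is configured to not run on a host using cgroup v1") points at the host's cgroup hierarchy. A quick diagnostic sketch (not part of the test run) to confirm which cgroup version a host exposes:

```shell
# Report the filesystem type of the cgroup mount point: "cgroup2fs" means
# cgroup v2; "tmpfs" means the legacy cgroup v1 hierarchy that this kubelet
# version refuses to start on.
stat -fc %T /sys/fs/cgroup/
```

The repeated kubelet restarts above are consistent with a v1 (tmpfs) result on this run's Ubuntu 20.04 host.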

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/TunnelCmd/serial/RunSecondTunnel (0.59s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/TunnelCmd/serial/RunSecondTunnel
functional_test_tunnel_test.go:154: (dbg) daemon: [out/minikube-linux-arm64 -p functional-449836 tunnel --alsologtostderr]
functional_test_tunnel_test.go:154: (dbg) daemon: [out/minikube-linux-arm64 -p functional-449836 tunnel --alsologtostderr]
functional_test_tunnel_test.go:190: tunnel command failed with unexpected error: exit code 103. stderr: I1202 19:26:03.379802   67846 out.go:360] Setting OutFile to fd 1 ...
I1202 19:26:03.379992   67846 out.go:408] TERM=,COLORTERM=, which probably does not support color
I1202 19:26:03.380002   67846 out.go:374] Setting ErrFile to fd 2...
I1202 19:26:03.380008   67846 out.go:408] TERM=,COLORTERM=, which probably does not support color
I1202 19:26:03.380396   67846 root.go:338] Updating PATH: /home/jenkins/minikube-integration/22021-2487/.minikube/bin
I1202 19:26:03.380685   67846 mustload.go:66] Loading cluster: functional-449836
I1202 19:26:03.381119   67846 config.go:182] Loaded profile config "functional-449836": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-beta.0
I1202 19:26:03.381666   67846 cli_runner.go:164] Run: docker container inspect functional-449836 --format={{.State.Status}}
I1202 19:26:03.432951   67846 host.go:66] Checking if "functional-449836" exists ...
I1202 19:26:03.433269   67846 cli_runner.go:164] Run: docker system info --format "{{json .}}"
I1202 19:26:03.588172   67846 info.go:266] docker info: {ID:EOU5:DNGX:XN6V:L2FZ:UXRM:5TWK:EVUR:KC2F:GT7Z:Y4O4:GB77:5PD3 Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:52 SystemTime:2025-12-02 19:26:03.573479399 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214839296 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-31-251 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
I1202 19:26:03.588286   67846 api_server.go:166] Checking apiserver status ...
I1202 19:26:03.588401   67846 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
I1202 19:26:03.588451   67846 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-449836
I1202 19:26:03.622154   67846 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:32788 SSHKeyPath:/home/jenkins/minikube-integration/22021-2487/.minikube/machines/functional-449836/id_rsa Username:docker}
W1202 19:26:03.739476   67846 api_server.go:170] stopped: unable to get apiserver pid: sudo pgrep -xnf kube-apiserver.*minikube.*: Process exited with status 1
stdout:

stderr:
I1202 19:26:03.743036   67846 out.go:179] * The control-plane node functional-449836 apiserver is not running: (state=Stopped)
I1202 19:26:03.746125   67846 out.go:179]   To start a cluster, run: "minikube start -p functional-449836"

stdout: * The control-plane node functional-449836 apiserver is not running: (state=Stopped)
To start a cluster, run: "minikube start -p functional-449836"
functional_test_tunnel_test.go:194: (dbg) stopping [out/minikube-linux-arm64 -p functional-449836 tunnel --alsologtostderr] ...
helpers_test.go:525: unable to kill pid 67847: os: process already finished
functional_test_tunnel_test.go:194: read stdout failed: read |0: file already closed
functional_test_tunnel_test.go:194: (dbg) [out/minikube-linux-arm64 -p functional-449836 tunnel --alsologtostderr] stdout:
functional_test_tunnel_test.go:194: read stderr failed: read |0: file already closed
functional_test_tunnel_test.go:194: (dbg) [out/minikube-linux-arm64 -p functional-449836 tunnel --alsologtostderr] stderr:
functional_test_tunnel_test.go:194: (dbg) stopping [out/minikube-linux-arm64 -p functional-449836 tunnel --alsologtostderr] ...
helpers_test.go:507: unable to find parent, assuming dead: process does not exist
functional_test_tunnel_test.go:194: read stdout failed: read |0: file already closed
functional_test_tunnel_test.go:194: (dbg) [out/minikube-linux-arm64 -p functional-449836 tunnel --alsologtostderr] stdout:
functional_test_tunnel_test.go:194: read stderr failed: read |0: file already closed
functional_test_tunnel_test.go:194: (dbg) [out/minikube-linux-arm64 -p functional-449836 tunnel --alsologtostderr] stderr:
--- FAIL: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/TunnelCmd/serial/RunSecondTunnel (0.59s)
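Every connection-refused error in this report traces back to the same stopped apiserver; a minimal reachability probe (a diagnostic sketch using the endpoint observed in this run, 192.168.49.2:8441) looks like:

```shell
# Hit the apiserver health endpoint with a short timeout; on this run it
# would fail exactly like the kubectl calls above, since nothing listens on
# port 8441 while kubelet is crash-looping.
curl -sk --max-time 5 https://192.168.49.2:8441/healthz \
  || echo "apiserver unreachable"
```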

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/TunnelCmd/serial/WaitService/Setup (0.11s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/TunnelCmd/serial/WaitService/Setup
functional_test_tunnel_test.go:212: (dbg) Run:  kubectl --context functional-449836 apply -f testdata/testsvc.yaml
functional_test_tunnel_test.go:212: (dbg) Non-zero exit: kubectl --context functional-449836 apply -f testdata/testsvc.yaml: exit status 1 (107.250132ms)

** stderr ** 
	error: error validating "testdata/testsvc.yaml": error validating data: failed to download openapi: Get "https://192.168.49.2:8441/openapi/v2?timeout=32s": dial tcp 192.168.49.2:8441: connect: connection refused; if you choose to ignore these errors, turn validation off with --validate=false

** /stderr **
functional_test_tunnel_test.go:214: kubectl --context functional-449836 apply -f testdata/testsvc.yaml failed: exit status 1
--- FAIL: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/TunnelCmd/serial/WaitService/Setup (0.11s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/TunnelCmd/serial/AccessDirect (92.8s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/TunnelCmd/serial/AccessDirect
functional_test_tunnel_test.go:288: failed to hit nginx at "http://10.101.95.205": Temporary Error: Get "http://10.101.95.205": context deadline exceeded (Client.Timeout exceeded while awaiting headers)
functional_test_tunnel_test.go:290: (dbg) Run:  kubectl --context functional-449836 get svc nginx-svc
functional_test_tunnel_test.go:290: (dbg) Non-zero exit: kubectl --context functional-449836 get svc nginx-svc: exit status 1 (66.234026ms)

** stderr ** 
	The connection to the server 192.168.49.2:8441 was refused - did you specify the right host or port?

** /stderr **
functional_test_tunnel_test.go:292: kubectl --context functional-449836 get svc nginx-svc failed: exit status 1
functional_test_tunnel_test.go:294: failed to kubectl get svc nginx-svc:
functional_test_tunnel_test.go:301: expected body to contain "Welcome to nginx!", but got *""*
--- FAIL: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/TunnelCmd/serial/AccessDirect (92.80s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ServiceCmd/DeployApp (0.06s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ServiceCmd/DeployApp
functional_test.go:1451: (dbg) Run:  kubectl --context functional-449836 create deployment hello-node --image kicbase/echo-server
functional_test.go:1451: (dbg) Non-zero exit: kubectl --context functional-449836 create deployment hello-node --image kicbase/echo-server: exit status 1 (61.486156ms)

** stderr ** 
	error: failed to create deployment: Post "https://192.168.49.2:8441/apis/apps/v1/namespaces/default/deployments?fieldManager=kubectl-create&fieldValidation=Strict": dial tcp 192.168.49.2:8441: connect: connection refused

** /stderr **
functional_test.go:1453: failed to create hello-node deployment with this command "kubectl --context functional-449836 create deployment hello-node --image kicbase/echo-server": exit status 1.
--- FAIL: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ServiceCmd/DeployApp (0.06s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ServiceCmd/List (0.25s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ServiceCmd/List
functional_test.go:1469: (dbg) Run:  out/minikube-linux-arm64 -p functional-449836 service list
functional_test.go:1469: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-449836 service list: exit status 103 (254.143374ms)

-- stdout --
	* The control-plane node functional-449836 apiserver is not running: (state=Stopped)
	  To start a cluster, run: "minikube start -p functional-449836"

-- /stdout --
functional_test.go:1471: failed to do service list. args "out/minikube-linux-arm64 -p functional-449836 service list" : exit status 103
functional_test.go:1474: expected 'service list' to contain *hello-node* but got -"* The control-plane node functional-449836 apiserver is not running: (state=Stopped)\n  To start a cluster, run: \"minikube start -p functional-449836\"\n"-
--- FAIL: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ServiceCmd/List (0.25s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ServiceCmd/JSONOutput (0.27s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ServiceCmd/JSONOutput
functional_test.go:1499: (dbg) Run:  out/minikube-linux-arm64 -p functional-449836 service list -o json
functional_test.go:1499: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-449836 service list -o json: exit status 103 (269.398124ms)

-- stdout --
	* The control-plane node functional-449836 apiserver is not running: (state=Stopped)
	  To start a cluster, run: "minikube start -p functional-449836"

-- /stdout --
functional_test.go:1501: failed to list services with json format. args "out/minikube-linux-arm64 -p functional-449836 service list -o json": exit status 103
--- FAIL: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ServiceCmd/JSONOutput (0.27s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ServiceCmd/HTTPS (0.29s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ServiceCmd/HTTPS
functional_test.go:1519: (dbg) Run:  out/minikube-linux-arm64 -p functional-449836 service --namespace=default --https --url hello-node
functional_test.go:1519: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-449836 service --namespace=default --https --url hello-node: exit status 103 (291.641582ms)

-- stdout --
	* The control-plane node functional-449836 apiserver is not running: (state=Stopped)
	  To start a cluster, run: "minikube start -p functional-449836"

-- /stdout --
functional_test.go:1521: failed to get service url. args "out/minikube-linux-arm64 -p functional-449836 service --namespace=default --https --url hello-node" : exit status 103
--- FAIL: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ServiceCmd/HTTPS (0.29s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ServiceCmd/Format (0.26s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ServiceCmd/Format
functional_test.go:1550: (dbg) Run:  out/minikube-linux-arm64 -p functional-449836 service hello-node --url --format={{.IP}}
functional_test.go:1550: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-449836 service hello-node --url --format={{.IP}}: exit status 103 (262.162864ms)

-- stdout --
	* The control-plane node functional-449836 apiserver is not running: (state=Stopped)
	  To start a cluster, run: "minikube start -p functional-449836"

-- /stdout --
functional_test.go:1552: failed to get service url with custom format. args "out/minikube-linux-arm64 -p functional-449836 service hello-node --url --format={{.IP}}": exit status 103
functional_test.go:1558: "* The control-plane node functional-449836 apiserver is not running: (state=Stopped)\n  To start a cluster, run: \"minikube start -p functional-449836\"" is not a valid IP
--- FAIL: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ServiceCmd/Format (0.26s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ServiceCmd/URL (0.26s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ServiceCmd/URL
functional_test.go:1569: (dbg) Run:  out/minikube-linux-arm64 -p functional-449836 service hello-node --url
functional_test.go:1569: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-449836 service hello-node --url: exit status 103 (261.583135ms)

-- stdout --
	* The control-plane node functional-449836 apiserver is not running: (state=Stopped)
	  To start a cluster, run: "minikube start -p functional-449836"

-- /stdout --
functional_test.go:1571: failed to get service url. args: "out/minikube-linux-arm64 -p functional-449836 service hello-node --url": exit status 103
functional_test.go:1575: found endpoint for hello-node: * The control-plane node functional-449836 apiserver is not running: (state=Stopped)
To start a cluster, run: "minikube start -p functional-449836"
functional_test.go:1579: failed to parse "* The control-plane node functional-449836 apiserver is not running: (state=Stopped)\n  To start a cluster, run: \"minikube start -p functional-449836\"": parse "* The control-plane node functional-449836 apiserver is not running: (state=Stopped)\n  To start a cluster, run: \"minikube start -p functional-449836\"": net/url: invalid control character in URL
--- FAIL: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ServiceCmd/URL (0.26s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/MountCmd/any-port (2.37s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/MountCmd/any-port
functional_test_mount_test.go:73: (dbg) daemon: [out/minikube-linux-arm64 mount -p functional-449836 /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-beta.0parallelMo2287448557/001:/mount-9p --alsologtostderr -v=1]
functional_test_mount_test.go:107: wrote "test-1764703664487000285" to /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-beta.0parallelMo2287448557/001/created-by-test
functional_test_mount_test.go:107: wrote "test-1764703664487000285" to /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-beta.0parallelMo2287448557/001/created-by-test-removed-by-pod
functional_test_mount_test.go:107: wrote "test-1764703664487000285" to /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-beta.0parallelMo2287448557/001/test-1764703664487000285
functional_test_mount_test.go:115: (dbg) Run:  out/minikube-linux-arm64 -p functional-449836 ssh "findmnt -T /mount-9p | grep 9p"
functional_test_mount_test.go:115: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-449836 ssh "findmnt -T /mount-9p | grep 9p": exit status 1 (384.333285ms)

** stderr ** 
	ssh: Process exited with status 1

** /stderr **
I1202 19:27:44.871584    4435 retry.go:31] will retry after 413.642868ms: exit status 1
functional_test_mount_test.go:115: (dbg) Run:  out/minikube-linux-arm64 -p functional-449836 ssh "findmnt -T /mount-9p | grep 9p"
functional_test_mount_test.go:129: (dbg) Run:  out/minikube-linux-arm64 -p functional-449836 ssh -- ls -la /mount-9p
functional_test_mount_test.go:133: guest mount directory contents
total 2
-rw-r--r-- 1 docker docker 24 Dec  2 19:27 created-by-test
-rw-r--r-- 1 docker docker 24 Dec  2 19:27 created-by-test-removed-by-pod
-rw-r--r-- 1 docker docker 24 Dec  2 19:27 test-1764703664487000285
functional_test_mount_test.go:137: (dbg) Run:  out/minikube-linux-arm64 -p functional-449836 ssh cat /mount-9p/test-1764703664487000285
functional_test_mount_test.go:148: (dbg) Run:  kubectl --context functional-449836 replace --force -f testdata/busybox-mount-test.yaml
functional_test_mount_test.go:148: (dbg) Non-zero exit: kubectl --context functional-449836 replace --force -f testdata/busybox-mount-test.yaml: exit status 1 (68.347207ms)
                                                
** stderr ** 
	error: error when deleting "testdata/busybox-mount-test.yaml": Delete "https://192.168.49.2:8441/api/v1/namespaces/default/pods/busybox-mount": dial tcp 192.168.49.2:8441: connect: connection refused

** /stderr **
functional_test_mount_test.go:150: failed to 'kubectl replace' for busybox-mount-test. args "kubectl --context functional-449836 replace --force -f testdata/busybox-mount-test.yaml" : exit status 1
functional_test_mount_test.go:80: "TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/MountCmd/any-port" failed, getting debug info...
functional_test_mount_test.go:81: (dbg) Run:  out/minikube-linux-arm64 -p functional-449836 ssh "mount | grep 9p; ls -la /mount-9p; cat /mount-9p/pod-dates"
functional_test_mount_test.go:81: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-449836 ssh "mount | grep 9p; ls -la /mount-9p; cat /mount-9p/pod-dates": exit status 1 (279.684062ms)

-- stdout --
	192.168.49.1 on /mount-9p type 9p (rw,relatime,sync,dirsync,dfltuid=1000,dfltgid=997,access=any,msize=262144,trans=tcp,noextend,port=35581)
	total 2
	-rw-r--r-- 1 docker docker 24 Dec  2 19:27 created-by-test
	-rw-r--r-- 1 docker docker 24 Dec  2 19:27 created-by-test-removed-by-pod
	-rw-r--r-- 1 docker docker 24 Dec  2 19:27 test-1764703664487000285
	cat: /mount-9p/pod-dates: No such file or directory

-- /stdout --
** stderr ** 
	ssh: Process exited with status 1
                                                
** /stderr **
functional_test_mount_test.go:83: debugging command "out/minikube-linux-arm64 -p functional-449836 ssh \"mount | grep 9p; ls -la /mount-9p; cat /mount-9p/pod-dates\"" failed : exit status 1
functional_test_mount_test.go:90: (dbg) Run:  out/minikube-linux-arm64 -p functional-449836 ssh "sudo umount -f /mount-9p"
functional_test_mount_test.go:94: (dbg) stopping [out/minikube-linux-arm64 mount -p functional-449836 /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-beta.0parallelMo2287448557/001:/mount-9p --alsologtostderr -v=1] ...
functional_test_mount_test.go:94: (dbg) [out/minikube-linux-arm64 mount -p functional-449836 /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-beta.0parallelMo2287448557/001:/mount-9p --alsologtostderr -v=1] stdout:
* Mounting host path /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-beta.0parallelMo2287448557/001 into VM as /mount-9p ...
- Mount type:   9p
- User ID:      docker
- Group ID:     docker
- Version:      9p2000.L
- Message Size: 262144
- Options:      map[]
- Bind Address: 192.168.49.1:35581
* Userspace file server: 
ufs starting
* Successfully mounted /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-beta.0parallelMo2287448557/001 to /mount-9p

* NOTE: This process must stay alive for the mount to be accessible ...
* Unmounting /mount-9p ...

functional_test_mount_test.go:94: (dbg) [out/minikube-linux-arm64 mount -p functional-449836 /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-beta.0parallelMo2287448557/001:/mount-9p --alsologtostderr -v=1] stderr:
I1202 19:27:44.538602   70121 out.go:360] Setting OutFile to fd 1 ...
I1202 19:27:44.538803   70121 out.go:408] TERM=,COLORTERM=, which probably does not support color
I1202 19:27:44.538821   70121 out.go:374] Setting ErrFile to fd 2...
I1202 19:27:44.538838   70121 out.go:408] TERM=,COLORTERM=, which probably does not support color
I1202 19:27:44.539120   70121 root.go:338] Updating PATH: /home/jenkins/minikube-integration/22021-2487/.minikube/bin
I1202 19:27:44.539433   70121 mustload.go:66] Loading cluster: functional-449836
I1202 19:27:44.539842   70121 config.go:182] Loaded profile config "functional-449836": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-beta.0
I1202 19:27:44.540456   70121 cli_runner.go:164] Run: docker container inspect functional-449836 --format={{.State.Status}}
I1202 19:27:44.577630   70121 host.go:66] Checking if "functional-449836" exists ...
I1202 19:27:44.577913   70121 cli_runner.go:164] Run: docker system info --format "{{json .}}"
I1202 19:27:44.688406   70121 info.go:266] docker info: {ID:EOU5:DNGX:XN6V:L2FZ:UXRM:5TWK:EVUR:KC2F:GT7Z:Y4O4:GB77:5PD3 Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:52 SystemTime:2025-12-02 19:27:44.663656493 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214839296 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-31-251 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
I1202 19:27:44.688554   70121 cli_runner.go:164] Run: docker network inspect functional-449836 --format "{"Name": "{{.Name}}","Driver": "{{.Driver}}","Subnet": "{{range .IPAM.Config}}{{.Subnet}}{{end}}","Gateway": "{{range .IPAM.Config}}{{.Gateway}}{{end}}","MTU": {{if (index .Options "com.docker.network.driver.mtu")}}{{(index .Options "com.docker.network.driver.mtu")}}{{else}}0{{end}}, "ContainerIPs": [{{range $k,$v := .Containers }}"{{$v.IPv4Address}}",{{end}}]}"
I1202 19:27:44.737661   70121 out.go:179] * Mounting host path /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-beta.0parallelMo2287448557/001 into VM as /mount-9p ...
I1202 19:27:44.740616   70121 out.go:179]   - Mount type:   9p
I1202 19:27:44.743552   70121 out.go:179]   - User ID:      docker
I1202 19:27:44.746506   70121 out.go:179]   - Group ID:     docker
I1202 19:27:44.749398   70121 out.go:179]   - Version:      9p2000.L
I1202 19:27:44.752140   70121 out.go:179]   - Message Size: 262144
I1202 19:27:44.755220   70121 out.go:179]   - Options:      map[]
I1202 19:27:44.758130   70121 out.go:179]   - Bind Address: 192.168.49.1:35581
I1202 19:27:44.761435   70121 out.go:179] * Userspace file server: 
I1202 19:27:44.761704   70121 ssh_runner.go:195] Run: /bin/bash -c "[ "x$(findmnt -T /mount-9p | grep /mount-9p)" != "x" ] && sudo umount -f -l /mount-9p || echo "
I1202 19:27:44.762001   70121 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-449836
I1202 19:27:44.785680   70121 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:32788 SSHKeyPath:/home/jenkins/minikube-integration/22021-2487/.minikube/machines/functional-449836/id_rsa Username:docker}
I1202 19:27:44.895588   70121 mount.go:180] unmount for /mount-9p ran successfully
I1202 19:27:44.895621   70121 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /mount-9p"
I1202 19:27:44.904126   70121 ssh_runner.go:195] Run: /bin/bash -c "sudo mount -t 9p -o dfltgid=$(grep ^docker: /etc/group | cut -d: -f3),dfltuid=$(id -u docker),msize=262144,port=35581,trans=tcp,version=9p2000.L 192.168.49.1 /mount-9p"
I1202 19:27:44.914881   70121 main.go:127] stdlog: ufs.go:141 connected
I1202 19:27:44.915037   70121 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:59912 Tversion tag 65535 msize 262144 version '9P2000.L'
I1202 19:27:44.915076   70121 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:59912 Rversion tag 65535 msize 262144 version '9P2000'
I1202 19:27:44.915334   70121 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:59912 Tattach tag 0 fid 0 afid 4294967295 uname 'nobody' nuname 0 aname ''
I1202 19:27:44.915391   70121 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:59912 Rattach tag 0 aqid (15c38d3 e088f964 'd')
I1202 19:27:44.916086   70121 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:59912 Tstat tag 0 fid 0
I1202 19:27:44.916150   70121 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:59912 Rstat tag 0 st ('001' 'jenkins' 'jenkins' '' q (15c38d3 e088f964 'd') m d775 at 0 mt 1764703664 l 4096 t 0 d 0 ext )
I1202 19:27:44.920918   70121 lock.go:50] WriteFile acquiring /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/.mount-process: {Name:mka65fc20e542da730c67c449a7d60eb4b207713 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
I1202 19:27:44.921132   70121 mount.go:105] mount successful: ""
I1202 19:27:44.924602   70121 out.go:179] * Successfully mounted /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-beta.0parallelMo2287448557/001 to /mount-9p
I1202 19:27:44.927503   70121 out.go:203] 
I1202 19:27:44.930420   70121 out.go:179] * NOTE: This process must stay alive for the mount to be accessible ...
I1202 19:27:45.840946   70121 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:59912 Tstat tag 0 fid 0
I1202 19:27:45.841026   70121 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:59912 Rstat tag 0 st ('001' 'jenkins' 'jenkins' '' q (15c38d3 e088f964 'd') m d775 at 0 mt 1764703664 l 4096 t 0 d 0 ext )
I1202 19:27:45.841544   70121 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:59912 Twalk tag 0 fid 0 newfid 1 
I1202 19:27:45.841601   70121 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:59912 Rwalk tag 0 
I1202 19:27:45.841822   70121 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:59912 Topen tag 0 fid 1 mode 0
I1202 19:27:45.841881   70121 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:59912 Ropen tag 0 qid (15c38d3 e088f964 'd') iounit 0
I1202 19:27:45.842039   70121 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:59912 Tstat tag 0 fid 0
I1202 19:27:45.842105   70121 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:59912 Rstat tag 0 st ('001' 'jenkins' 'jenkins' '' q (15c38d3 e088f964 'd') m d775 at 0 mt 1764703664 l 4096 t 0 d 0 ext )
I1202 19:27:45.842298   70121 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:59912 Tread tag 0 fid 1 offset 0 count 262120
I1202 19:27:45.842445   70121 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:59912 Rread tag 0 count 258
I1202 19:27:45.842606   70121 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:59912 Tread tag 0 fid 1 offset 258 count 261862
I1202 19:27:45.842641   70121 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:59912 Rread tag 0 count 0
I1202 19:27:45.842755   70121 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:59912 Tread tag 0 fid 1 offset 258 count 262120
I1202 19:27:45.842784   70121 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:59912 Rread tag 0 count 0
I1202 19:27:45.842925   70121 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:59912 Twalk tag 0 fid 0 newfid 2 0:'created-by-test' 
I1202 19:27:45.842962   70121 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:59912 Rwalk tag 0 (15c38d4 e088f964 '') 
I1202 19:27:45.843083   70121 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:59912 Tstat tag 0 fid 2
I1202 19:27:45.843193   70121 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:59912 Rstat tag 0 st ('created-by-test' 'jenkins' 'jenkins' '' q (15c38d4 e088f964 '') m 644 at 0 mt 1764703664 l 24 t 0 d 0 ext )
I1202 19:27:45.843336   70121 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:59912 Tstat tag 0 fid 2
I1202 19:27:45.843372   70121 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:59912 Rstat tag 0 st ('created-by-test' 'jenkins' 'jenkins' '' q (15c38d4 e088f964 '') m 644 at 0 mt 1764703664 l 24 t 0 d 0 ext )
I1202 19:27:45.843486   70121 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:59912 Tclunk tag 0 fid 2
I1202 19:27:45.843519   70121 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:59912 Rclunk tag 0
I1202 19:27:45.843661   70121 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:59912 Twalk tag 0 fid 0 newfid 2 0:'test-1764703664487000285' 
I1202 19:27:45.843708   70121 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:59912 Rwalk tag 0 (15c38d6 e088f964 '') 
I1202 19:27:45.843837   70121 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:59912 Tstat tag 0 fid 2
I1202 19:27:45.843874   70121 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:59912 Rstat tag 0 st ('test-1764703664487000285' 'jenkins' 'jenkins' '' q (15c38d6 e088f964 '') m 644 at 0 mt 1764703664 l 24 t 0 d 0 ext )
I1202 19:27:45.844046   70121 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:59912 Tstat tag 0 fid 2
I1202 19:27:45.844132   70121 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:59912 Rstat tag 0 st ('test-1764703664487000285' 'jenkins' 'jenkins' '' q (15c38d6 e088f964 '') m 644 at 0 mt 1764703664 l 24 t 0 d 0 ext )
I1202 19:27:45.844427   70121 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:59912 Tclunk tag 0 fid 2
I1202 19:27:45.844478   70121 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:59912 Rclunk tag 0
I1202 19:27:45.844662   70121 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:59912 Twalk tag 0 fid 0 newfid 2 0:'created-by-test-removed-by-pod' 
I1202 19:27:45.844728   70121 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:59912 Rwalk tag 0 (15c38d5 e088f964 '') 
I1202 19:27:45.844860   70121 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:59912 Tstat tag 0 fid 2
I1202 19:27:45.844908   70121 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:59912 Rstat tag 0 st ('created-by-test-removed-by-pod' 'jenkins' 'jenkins' '' q (15c38d5 e088f964 '') m 644 at 0 mt 1764703664 l 24 t 0 d 0 ext )
I1202 19:27:45.845026   70121 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:59912 Tstat tag 0 fid 2
I1202 19:27:45.845064   70121 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:59912 Rstat tag 0 st ('created-by-test-removed-by-pod' 'jenkins' 'jenkins' '' q (15c38d5 e088f964 '') m 644 at 0 mt 1764703664 l 24 t 0 d 0 ext )
I1202 19:27:45.845202   70121 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:59912 Tclunk tag 0 fid 2
I1202 19:27:45.845225   70121 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:59912 Rclunk tag 0
I1202 19:27:45.845348   70121 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:59912 Tread tag 0 fid 1 offset 258 count 262120
I1202 19:27:45.845377   70121 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:59912 Rread tag 0 count 0
I1202 19:27:45.845528   70121 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:59912 Tclunk tag 0 fid 1
I1202 19:27:45.845561   70121 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:59912 Rclunk tag 0
I1202 19:27:46.116591   70121 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:59912 Twalk tag 0 fid 0 newfid 1 0:'test-1764703664487000285' 
I1202 19:27:46.116672   70121 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:59912 Rwalk tag 0 (15c38d6 e088f964 '') 
I1202 19:27:46.116859   70121 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:59912 Tstat tag 0 fid 1
I1202 19:27:46.116903   70121 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:59912 Rstat tag 0 st ('test-1764703664487000285' 'jenkins' 'jenkins' '' q (15c38d6 e088f964 '') m 644 at 0 mt 1764703664 l 24 t 0 d 0 ext )
I1202 19:27:46.117048   70121 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:59912 Twalk tag 0 fid 1 newfid 2 
I1202 19:27:46.117077   70121 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:59912 Rwalk tag 0 
I1202 19:27:46.117211   70121 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:59912 Topen tag 0 fid 2 mode 0
I1202 19:27:46.117273   70121 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:59912 Ropen tag 0 qid (15c38d6 e088f964 '') iounit 0
I1202 19:27:46.117406   70121 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:59912 Tstat tag 0 fid 1
I1202 19:27:46.117443   70121 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:59912 Rstat tag 0 st ('test-1764703664487000285' 'jenkins' 'jenkins' '' q (15c38d6 e088f964 '') m 644 at 0 mt 1764703664 l 24 t 0 d 0 ext )
I1202 19:27:46.117601   70121 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:59912 Tread tag 0 fid 2 offset 0 count 262120
I1202 19:27:46.117641   70121 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:59912 Rread tag 0 count 24
I1202 19:27:46.117763   70121 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:59912 Tread tag 0 fid 2 offset 24 count 262120
I1202 19:27:46.117789   70121 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:59912 Rread tag 0 count 0
I1202 19:27:46.117942   70121 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:59912 Tread tag 0 fid 2 offset 24 count 262120
I1202 19:27:46.117991   70121 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:59912 Rread tag 0 count 0
I1202 19:27:46.118202   70121 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:59912 Tclunk tag 0 fid 2
I1202 19:27:46.118239   70121 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:59912 Rclunk tag 0
I1202 19:27:46.118427   70121 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:59912 Tclunk tag 0 fid 1
I1202 19:27:46.118456   70121 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:59912 Rclunk tag 0
I1202 19:27:46.467269   70121 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:59912 Tstat tag 0 fid 0
I1202 19:27:46.467342   70121 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:59912 Rstat tag 0 st ('001' 'jenkins' 'jenkins' '' q (15c38d3 e088f964 'd') m d775 at 0 mt 1764703664 l 4096 t 0 d 0 ext )
I1202 19:27:46.467706   70121 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:59912 Twalk tag 0 fid 0 newfid 1 
I1202 19:27:46.467743   70121 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:59912 Rwalk tag 0 
I1202 19:27:46.467860   70121 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:59912 Topen tag 0 fid 1 mode 0
I1202 19:27:46.467911   70121 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:59912 Ropen tag 0 qid (15c38d3 e088f964 'd') iounit 0
I1202 19:27:46.468054   70121 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:59912 Tstat tag 0 fid 0
I1202 19:27:46.468112   70121 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:59912 Rstat tag 0 st ('001' 'jenkins' 'jenkins' '' q (15c38d3 e088f964 'd') m d775 at 0 mt 1764703664 l 4096 t 0 d 0 ext )
I1202 19:27:46.468276   70121 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:59912 Tread tag 0 fid 1 offset 0 count 262120
I1202 19:27:46.468424   70121 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:59912 Rread tag 0 count 258
I1202 19:27:46.468565   70121 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:59912 Tread tag 0 fid 1 offset 258 count 261862
I1202 19:27:46.468595   70121 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:59912 Rread tag 0 count 0
I1202 19:27:46.468713   70121 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:59912 Tread tag 0 fid 1 offset 258 count 262120
I1202 19:27:46.468741   70121 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:59912 Rread tag 0 count 0
I1202 19:27:46.468887   70121 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:59912 Twalk tag 0 fid 0 newfid 2 0:'created-by-test' 
I1202 19:27:46.468919   70121 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:59912 Rwalk tag 0 (15c38d4 e088f964 '') 
I1202 19:27:46.469032   70121 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:59912 Tstat tag 0 fid 2
I1202 19:27:46.469078   70121 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:59912 Rstat tag 0 st ('created-by-test' 'jenkins' 'jenkins' '' q (15c38d4 e088f964 '') m 644 at 0 mt 1764703664 l 24 t 0 d 0 ext )
I1202 19:27:46.469210   70121 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:59912 Tstat tag 0 fid 2
I1202 19:27:46.469239   70121 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:59912 Rstat tag 0 st ('created-by-test' 'jenkins' 'jenkins' '' q (15c38d4 e088f964 '') m 644 at 0 mt 1764703664 l 24 t 0 d 0 ext )
I1202 19:27:46.469396   70121 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:59912 Tclunk tag 0 fid 2
I1202 19:27:46.469416   70121 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:59912 Rclunk tag 0
I1202 19:27:46.469568   70121 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:59912 Twalk tag 0 fid 0 newfid 2 0:'test-1764703664487000285' 
I1202 19:27:46.469599   70121 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:59912 Rwalk tag 0 (15c38d6 e088f964 '') 
I1202 19:27:46.469709   70121 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:59912 Tstat tag 0 fid 2
I1202 19:27:46.469737   70121 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:59912 Rstat tag 0 st ('test-1764703664487000285' 'jenkins' 'jenkins' '' q (15c38d6 e088f964 '') m 644 at 0 mt 1764703664 l 24 t 0 d 0 ext )
I1202 19:27:46.469864   70121 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:59912 Tstat tag 0 fid 2
I1202 19:27:46.469902   70121 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:59912 Rstat tag 0 st ('test-1764703664487000285' 'jenkins' 'jenkins' '' q (15c38d6 e088f964 '') m 644 at 0 mt 1764703664 l 24 t 0 d 0 ext )
I1202 19:27:46.470013   70121 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:59912 Tclunk tag 0 fid 2
I1202 19:27:46.470036   70121 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:59912 Rclunk tag 0
I1202 19:27:46.470216   70121 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:59912 Twalk tag 0 fid 0 newfid 2 0:'created-by-test-removed-by-pod' 
I1202 19:27:46.470264   70121 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:59912 Rwalk tag 0 (15c38d5 e088f964 '') 
I1202 19:27:46.470414   70121 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:59912 Tstat tag 0 fid 2
I1202 19:27:46.470460   70121 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:59912 Rstat tag 0 st ('created-by-test-removed-by-pod' 'jenkins' 'jenkins' '' q (15c38d5 e088f964 '') m 644 at 0 mt 1764703664 l 24 t 0 d 0 ext )
I1202 19:27:46.470601   70121 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:59912 Tstat tag 0 fid 2
I1202 19:27:46.470635   70121 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:59912 Rstat tag 0 st ('created-by-test-removed-by-pod' 'jenkins' 'jenkins' '' q (15c38d5 e088f964 '') m 644 at 0 mt 1764703664 l 24 t 0 d 0 ext )
I1202 19:27:46.470796   70121 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:59912 Tclunk tag 0 fid 2
I1202 19:27:46.470816   70121 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:59912 Rclunk tag 0
I1202 19:27:46.471085   70121 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:59912 Tread tag 0 fid 1 offset 258 count 262120
I1202 19:27:46.471141   70121 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:59912 Rread tag 0 count 0
I1202 19:27:46.471314   70121 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:59912 Tclunk tag 0 fid 1
I1202 19:27:46.471343   70121 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:59912 Rclunk tag 0
I1202 19:27:46.472816   70121 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:59912 Twalk tag 0 fid 0 newfid 1 0:'pod-dates' 
I1202 19:27:46.472901   70121 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:59912 Rerror tag 0 ename 'file not found' ecode 0
I1202 19:27:46.734621   70121 main.go:127] stdlog: srv_conn.go:133 >>> 192.168.49.2:59912 Tclunk tag 0 fid 0
I1202 19:27:46.734691   70121 main.go:127] stdlog: srv_conn.go:190 <<< 192.168.49.2:59912 Rclunk tag 0
I1202 19:27:46.735755   70121 main.go:127] stdlog: ufs.go:147 disconnected
I1202 19:27:46.756436   70121 out.go:179] * Unmounting /mount-9p ...
I1202 19:27:46.759268   70121 ssh_runner.go:195] Run: /bin/bash -c "[ "x$(findmnt -T /mount-9p | grep /mount-9p)" != "x" ] && sudo umount -f -l /mount-9p || echo "
I1202 19:27:46.766168   70121 mount.go:180] unmount for /mount-9p ran successfully
I1202 19:27:46.766289   70121 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/.mount-process: {Name:mka65fc20e542da730c67c449a7d60eb4b207713 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
I1202 19:27:46.769474   70121 out.go:203] 
W1202 19:27:46.772449   70121 out.go:285] X Exiting due to MK_INTERRUPTED: Received terminated signal
X Exiting due to MK_INTERRUPTED: Received terminated signal
I1202 19:27:46.775307   70121 out.go:203] 
--- FAIL: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/MountCmd/any-port (2.37s)

TestKubernetesUpgrade (792.37s)

=== RUN   TestKubernetesUpgrade
=== PAUSE TestKubernetesUpgrade

=== CONT  TestKubernetesUpgrade
version_upgrade_test.go:222: (dbg) Run:  out/minikube-linux-arm64 start -p kubernetes-upgrade-685093 --memory=3072 --kubernetes-version=v1.28.0 --alsologtostderr -v=1 --driver=docker  --container-runtime=containerd
version_upgrade_test.go:222: (dbg) Done: out/minikube-linux-arm64 start -p kubernetes-upgrade-685093 --memory=3072 --kubernetes-version=v1.28.0 --alsologtostderr -v=1 --driver=docker  --container-runtime=containerd: (37.949768016s)
version_upgrade_test.go:227: (dbg) Run:  out/minikube-linux-arm64 stop -p kubernetes-upgrade-685093
version_upgrade_test.go:227: (dbg) Done: out/minikube-linux-arm64 stop -p kubernetes-upgrade-685093: (2.285841988s)
version_upgrade_test.go:232: (dbg) Run:  out/minikube-linux-arm64 -p kubernetes-upgrade-685093 status --format={{.Host}}
version_upgrade_test.go:232: (dbg) Non-zero exit: out/minikube-linux-arm64 -p kubernetes-upgrade-685093 status --format={{.Host}}: exit status 7 (100.343042ms)

-- stdout --
	Stopped

-- /stdout --
version_upgrade_test.go:234: status error: exit status 7 (may be ok)
version_upgrade_test.go:243: (dbg) Run:  out/minikube-linux-arm64 start -p kubernetes-upgrade-685093 --memory=3072 --kubernetes-version=v1.35.0-beta.0 --alsologtostderr -v=1 --driver=docker  --container-runtime=containerd
version_upgrade_test.go:243: (dbg) Non-zero exit: out/minikube-linux-arm64 start -p kubernetes-upgrade-685093 --memory=3072 --kubernetes-version=v1.35.0-beta.0 --alsologtostderr -v=1 --driver=docker  --container-runtime=containerd: exit status 109 (12m27.425583602s)

-- stdout --
	* [kubernetes-upgrade-685093] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	  - MINIKUBE_LOCATION=22021
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - KUBECONFIG=/home/jenkins/minikube-integration/22021-2487/kubeconfig
	  - MINIKUBE_HOME=/home/jenkins/minikube-integration/22021-2487/.minikube
	  - MINIKUBE_BIN=out/minikube-linux-arm64
	  - MINIKUBE_FORCE_SYSTEMD=
	* Using the docker driver based on existing profile
	* Starting "kubernetes-upgrade-685093" primary control-plane node in "kubernetes-upgrade-685093" cluster
	* Pulling base image v0.0.48-1764169655-21974 ...
	* Preparing Kubernetes v1.35.0-beta.0 on containerd 2.1.5 ...
	
	

-- /stdout --
** stderr ** 
	I1202 19:59:45.341070  202120 out.go:360] Setting OutFile to fd 1 ...
	I1202 19:59:45.341314  202120 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1202 19:59:45.341342  202120 out.go:374] Setting ErrFile to fd 2...
	I1202 19:59:45.341362  202120 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1202 19:59:45.341697  202120 root.go:338] Updating PATH: /home/jenkins/minikube-integration/22021-2487/.minikube/bin
	I1202 19:59:45.342188  202120 out.go:368] Setting JSON to false
	I1202 19:59:45.343222  202120 start.go:133] hostinfo: {"hostname":"ip-172-31-31-251","uptime":6122,"bootTime":1764699464,"procs":189,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1084-aws","kernelArch":"aarch64","virtualizationSystem":"","virtualizationRole":"","hostId":"982e3628-3742-4b3e-bb63-ac1b07660ec7"}
	I1202 19:59:45.343327  202120 start.go:143] virtualization:  
	I1202 19:59:45.346611  202120 out.go:179] * [kubernetes-upgrade-685093] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	I1202 19:59:45.350891  202120 out.go:179]   - MINIKUBE_LOCATION=22021
	I1202 19:59:45.351175  202120 notify.go:221] Checking for updates...
	I1202 19:59:45.357381  202120 out.go:179]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I1202 19:59:45.360400  202120 out.go:179]   - KUBECONFIG=/home/jenkins/minikube-integration/22021-2487/kubeconfig
	I1202 19:59:45.363340  202120 out.go:179]   - MINIKUBE_HOME=/home/jenkins/minikube-integration/22021-2487/.minikube
	I1202 19:59:45.366276  202120 out.go:179]   - MINIKUBE_BIN=out/minikube-linux-arm64
	I1202 19:59:45.369122  202120 out.go:179]   - MINIKUBE_FORCE_SYSTEMD=
	I1202 19:59:45.372408  202120 config.go:182] Loaded profile config "kubernetes-upgrade-685093": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.28.0
	I1202 19:59:45.373013  202120 driver.go:422] Setting default libvirt URI to qemu:///system
	I1202 19:59:45.406168  202120 docker.go:124] docker version: linux-28.1.1:Docker Engine - Community
	I1202 19:59:45.406331  202120 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1202 19:59:45.465561  202120 info.go:266] docker info: {ID:EOU5:DNGX:XN6V:L2FZ:UXRM:5TWK:EVUR:KC2F:GT7Z:Y4O4:GB77:5PD3 Containers:2 ContainersRunning:1 ContainersPaused:0 ContainersStopped:1 Images:4 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:44 OomKillDisable:true NGoroutines:52 SystemTime:2025-12-02 19:59:45.455729106 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214839296 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-31-251 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1202 19:59:45.465675  202120 docker.go:319] overlay module found
	I1202 19:59:45.468948  202120 out.go:179] * Using the docker driver based on existing profile
	I1202 19:59:45.471863  202120 start.go:309] selected driver: docker
	I1202 19:59:45.471884  202120 start.go:927] validating driver "docker" against &{Name:kubernetes-upgrade-685093 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b Memory:3072 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.28.0 ClusterName:kubernetes-upgrade-685093 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.76.2 Port:8443 KubernetesVersion:v1.28.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[default-storageclass:true storage-provisioner:true] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1202 19:59:45.471996  202120 start.go:938] status for docker: {Installed:true Healthy:true Running:false NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I1202 19:59:45.472853  202120 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1202 19:59:45.532410  202120 info.go:266] docker info: {ID:EOU5:DNGX:XN6V:L2FZ:UXRM:5TWK:EVUR:KC2F:GT7Z:Y4O4:GB77:5PD3 Containers:2 ContainersRunning:1 ContainersPaused:0 ContainersStopped:1 Images:4 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:44 OomKillDisable:true NGoroutines:52 SystemTime:2025-12-02 19:59:45.522381034 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214839296 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-31-251 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1202 19:59:45.532738  202120 cni.go:84] Creating CNI manager for ""
	I1202 19:59:45.532806  202120 cni.go:143] "docker" driver + "containerd" runtime found, recommending kindnet
	I1202 19:59:45.532850  202120 start.go:353] cluster config:
	{Name:kubernetes-upgrade-685093 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b Memory:3072 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-beta.0 ClusterName:kubernetes-upgrade-685093 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.76.2 Port:8443 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[default-storageclass:true storage-provisioner:true] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1202 19:59:45.536071  202120 out.go:179] * Starting "kubernetes-upgrade-685093" primary control-plane node in "kubernetes-upgrade-685093" cluster
	I1202 19:59:45.538937  202120 cache.go:134] Beginning downloading kic base image for docker with containerd
	I1202 19:59:45.541850  202120 out.go:179] * Pulling base image v0.0.48-1764169655-21974 ...
	I1202 19:59:45.544761  202120 preload.go:188] Checking if preload exists for k8s version v1.35.0-beta.0 and runtime containerd
	I1202 19:59:45.544830  202120 image.go:81] Checking for gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b in local docker daemon
	I1202 19:59:45.565445  202120 image.go:100] Found gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b in local docker daemon, skipping pull
	I1202 19:59:45.565466  202120 cache.go:158] gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b exists in daemon, skipping load
	W1202 19:59:45.607665  202120 preload.go:144] https://storage.googleapis.com/minikube-preloaded-volume-tarballs/v18/v1.35.0-beta.0/preloaded-images-k8s-v18-v1.35.0-beta.0-containerd-overlay2-arm64.tar.lz4 status code: 404
	W1202 19:59:45.927218  202120 preload.go:144] https://github.com/kubernetes-sigs/minikube-preloads/releases/download/v18/preloaded-images-k8s-v18-v1.35.0-beta.0-containerd-overlay2-arm64.tar.lz4 status code: 404
	I1202 19:59:45.927458  202120 profile.go:143] Saving config to /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/kubernetes-upgrade-685093/config.json ...
	I1202 19:59:45.927504  202120 cache.go:107] acquiring lock: {Name:mkb3ffc95e4b7ac3756206049d851bf516a8abb7 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1202 19:59:45.927595  202120 cache.go:115] /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/gcr.io/k8s-minikube/storage-provisioner_v5 exists
	I1202 19:59:45.927611  202120 cache.go:96] cache image "gcr.io/k8s-minikube/storage-provisioner:v5" -> "/home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/gcr.io/k8s-minikube/storage-provisioner_v5" took 123.112µs
	I1202 19:59:45.927628  202120 cache.go:80] save to tar file gcr.io/k8s-minikube/storage-provisioner:v5 -> /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/gcr.io/k8s-minikube/storage-provisioner_v5 succeeded
	I1202 19:59:45.927641  202120 cache.go:107] acquiring lock: {Name:mkfa39bba55c97fa80e441f8dcbaf6dc6a2ab6fc Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1202 19:59:45.927677  202120 cache.go:115] /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/kube-apiserver_v1.35.0-beta.0 exists
	I1202 19:59:45.927686  202120 cache.go:96] cache image "registry.k8s.io/kube-apiserver:v1.35.0-beta.0" -> "/home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/kube-apiserver_v1.35.0-beta.0" took 47.475µs
	I1202 19:59:45.927693  202120 cache.go:80] save to tar file registry.k8s.io/kube-apiserver:v1.35.0-beta.0 -> /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/kube-apiserver_v1.35.0-beta.0 succeeded
	I1202 19:59:45.927703  202120 cache.go:107] acquiring lock: {Name:mk7e3720bc30e96a70479f1acc707ef52791d566 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1202 19:59:45.927737  202120 cache.go:115] /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/kube-controller-manager_v1.35.0-beta.0 exists
	I1202 19:59:45.927746  202120 cache.go:96] cache image "registry.k8s.io/kube-controller-manager:v1.35.0-beta.0" -> "/home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/kube-controller-manager_v1.35.0-beta.0" took 43.964µs
	I1202 19:59:45.927752  202120 cache.go:80] save to tar file registry.k8s.io/kube-controller-manager:v1.35.0-beta.0 -> /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/kube-controller-manager_v1.35.0-beta.0 succeeded
	I1202 19:59:45.927767  202120 cache.go:107] acquiring lock: {Name:mk87fcb81abcb9216a37cb770c1db1797c0a7f91 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1202 19:59:45.927795  202120 cache.go:115] /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/kube-scheduler_v1.35.0-beta.0 exists
	I1202 19:59:45.927805  202120 cache.go:96] cache image "registry.k8s.io/kube-scheduler:v1.35.0-beta.0" -> "/home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/kube-scheduler_v1.35.0-beta.0" took 38.655µs
	I1202 19:59:45.927811  202120 cache.go:80] save to tar file registry.k8s.io/kube-scheduler:v1.35.0-beta.0 -> /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/kube-scheduler_v1.35.0-beta.0 succeeded
	I1202 19:59:45.927819  202120 cache.go:107] acquiring lock: {Name:mkb0da8840651a370490ea2b46213e13fc0d5dac Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1202 19:59:45.927848  202120 cache.go:115] /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/kube-proxy_v1.35.0-beta.0 exists
	I1202 19:59:45.927857  202120 cache.go:96] cache image "registry.k8s.io/kube-proxy:v1.35.0-beta.0" -> "/home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/kube-proxy_v1.35.0-beta.0" took 38.803µs
	I1202 19:59:45.927863  202120 cache.go:80] save to tar file registry.k8s.io/kube-proxy:v1.35.0-beta.0 -> /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/kube-proxy_v1.35.0-beta.0 succeeded
	I1202 19:59:45.927872  202120 cache.go:107] acquiring lock: {Name:mk9eec99a3e8e54b076a2ce506d08ceb8a7f49cb Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1202 19:59:45.927910  202120 cache.go:115] /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/pause_3.10.1 exists
	I1202 19:59:45.927918  202120 cache.go:96] cache image "registry.k8s.io/pause:3.10.1" -> "/home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/pause_3.10.1" took 47.181µs
	I1202 19:59:45.927924  202120 cache.go:80] save to tar file registry.k8s.io/pause:3.10.1 -> /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/pause_3.10.1 succeeded
	I1202 19:59:45.927933  202120 cache.go:107] acquiring lock: {Name:mk280b51a6d3bfe0cb60ae7355309f1bf1f99e1d Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1202 19:59:45.927963  202120 cache.go:115] /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/etcd_3.6.5-0 exists
	I1202 19:59:45.927972  202120 cache.go:96] cache image "registry.k8s.io/etcd:3.6.5-0" -> "/home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/etcd_3.6.5-0" took 39.205µs
	I1202 19:59:45.927977  202120 cache.go:80] save to tar file registry.k8s.io/etcd:3.6.5-0 -> /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/etcd_3.6.5-0 succeeded
	I1202 19:59:45.928001  202120 cache.go:107] acquiring lock: {Name:mkbf2c8ea9fae755e8e7ae1c483527f313757bae Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1202 19:59:45.928032  202120 cache.go:243] Successfully downloaded all kic artifacts
	I1202 19:59:45.928086  202120 start.go:360] acquireMachinesLock for kubernetes-upgrade-685093: {Name:mkb4321afc2bc9123acf5847221bb8f803362553 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1202 19:59:45.928140  202120 start.go:364] duration metric: took 30.491µs to acquireMachinesLock for "kubernetes-upgrade-685093"
	I1202 19:59:45.928155  202120 start.go:96] Skipping create...Using existing machine configuration
	I1202 19:59:45.928168  202120 fix.go:54] fixHost starting: 
	I1202 19:59:45.928036  202120 cache.go:115] /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/coredns/coredns_v1.13.1 exists
	I1202 19:59:45.928190  202120 cache.go:96] cache image "registry.k8s.io/coredns/coredns:v1.13.1" -> "/home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/coredns/coredns_v1.13.1" took 188.697µs
	I1202 19:59:45.928198  202120 cache.go:80] save to tar file registry.k8s.io/coredns/coredns:v1.13.1 -> /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/coredns/coredns_v1.13.1 succeeded
	I1202 19:59:45.928209  202120 cache.go:87] Successfully saved all images to host disk.
	I1202 19:59:45.928461  202120 cli_runner.go:164] Run: docker container inspect kubernetes-upgrade-685093 --format={{.State.Status}}
	I1202 19:59:45.945882  202120 fix.go:112] recreateIfNeeded on kubernetes-upgrade-685093: state=Stopped err=<nil>
	W1202 19:59:45.945922  202120 fix.go:138] unexpected machine state, will restart: <nil>
	I1202 19:59:45.949293  202120 out.go:252] * Restarting existing docker container for "kubernetes-upgrade-685093" ...
	I1202 19:59:45.949394  202120 cli_runner.go:164] Run: docker start kubernetes-upgrade-685093
	I1202 19:59:46.244072  202120 cli_runner.go:164] Run: docker container inspect kubernetes-upgrade-685093 --format={{.State.Status}}
	I1202 19:59:46.266023  202120 kic.go:430] container "kubernetes-upgrade-685093" state is running.
	I1202 19:59:46.266679  202120 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" kubernetes-upgrade-685093
	I1202 19:59:46.288744  202120 profile.go:143] Saving config to /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/kubernetes-upgrade-685093/config.json ...
	I1202 19:59:46.289314  202120 machine.go:94] provisionDockerMachine start ...
	I1202 19:59:46.289491  202120 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" kubernetes-upgrade-685093
	I1202 19:59:46.314396  202120 main.go:143] libmachine: Using SSH client type: native
	I1202 19:59:46.315099  202120 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3dad70] 0x3dd270 <nil>  [] 0s} 127.0.0.1 33015 <nil> <nil>}
	I1202 19:59:46.315116  202120 main.go:143] libmachine: About to run SSH command:
	hostname
	I1202 19:59:46.316270  202120 main.go:143] libmachine: Error dialing TCP: ssh: handshake failed: EOF
	I1202 19:59:49.468868  202120 main.go:143] libmachine: SSH cmd err, output: <nil>: kubernetes-upgrade-685093
	
	I1202 19:59:49.468892  202120 ubuntu.go:182] provisioning hostname "kubernetes-upgrade-685093"
	I1202 19:59:49.468966  202120 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" kubernetes-upgrade-685093
	I1202 19:59:49.487465  202120 main.go:143] libmachine: Using SSH client type: native
	I1202 19:59:49.487792  202120 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3dad70] 0x3dd270 <nil>  [] 0s} 127.0.0.1 33015 <nil> <nil>}
	I1202 19:59:49.487811  202120 main.go:143] libmachine: About to run SSH command:
	sudo hostname kubernetes-upgrade-685093 && echo "kubernetes-upgrade-685093" | sudo tee /etc/hostname
	I1202 19:59:49.646385  202120 main.go:143] libmachine: SSH cmd err, output: <nil>: kubernetes-upgrade-685093
	
	I1202 19:59:49.646464  202120 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" kubernetes-upgrade-685093
	I1202 19:59:49.670132  202120 main.go:143] libmachine: Using SSH client type: native
	I1202 19:59:49.670434  202120 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3dad70] 0x3dd270 <nil>  [] 0s} 127.0.0.1 33015 <nil> <nil>}
	I1202 19:59:49.670455  202120 main.go:143] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\skubernetes-upgrade-685093' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 kubernetes-upgrade-685093/g' /etc/hosts;
				else 
					echo '127.0.1.1 kubernetes-upgrade-685093' | sudo tee -a /etc/hosts; 
				fi
			fi
	I1202 19:59:49.824787  202120 main.go:143] libmachine: SSH cmd err, output: <nil>: 
	I1202 19:59:49.824812  202120 ubuntu.go:188] set auth options {CertDir:/home/jenkins/minikube-integration/22021-2487/.minikube CaCertPath:/home/jenkins/minikube-integration/22021-2487/.minikube/certs/ca.pem CaPrivateKeyPath:/home/jenkins/minikube-integration/22021-2487/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/home/jenkins/minikube-integration/22021-2487/.minikube/machines/server.pem ServerKeyPath:/home/jenkins/minikube-integration/22021-2487/.minikube/machines/server-key.pem ClientKeyPath:/home/jenkins/minikube-integration/22021-2487/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/home/jenkins/minikube-integration/22021-2487/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/home/jenkins/minikube-integration/22021-2487/.minikube}
	I1202 19:59:49.824841  202120 ubuntu.go:190] setting up certificates
	I1202 19:59:49.824850  202120 provision.go:84] configureAuth start
	I1202 19:59:49.824931  202120 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" kubernetes-upgrade-685093
	I1202 19:59:49.842859  202120 provision.go:143] copyHostCerts
	I1202 19:59:49.842933  202120 exec_runner.go:144] found /home/jenkins/minikube-integration/22021-2487/.minikube/cert.pem, removing ...
	I1202 19:59:49.842956  202120 exec_runner.go:203] rm: /home/jenkins/minikube-integration/22021-2487/.minikube/cert.pem
	I1202 19:59:49.843041  202120 exec_runner.go:151] cp: /home/jenkins/minikube-integration/22021-2487/.minikube/certs/cert.pem --> /home/jenkins/minikube-integration/22021-2487/.minikube/cert.pem (1123 bytes)
	I1202 19:59:49.843148  202120 exec_runner.go:144] found /home/jenkins/minikube-integration/22021-2487/.minikube/key.pem, removing ...
	I1202 19:59:49.843159  202120 exec_runner.go:203] rm: /home/jenkins/minikube-integration/22021-2487/.minikube/key.pem
	I1202 19:59:49.843187  202120 exec_runner.go:151] cp: /home/jenkins/minikube-integration/22021-2487/.minikube/certs/key.pem --> /home/jenkins/minikube-integration/22021-2487/.minikube/key.pem (1675 bytes)
	I1202 19:59:49.843253  202120 exec_runner.go:144] found /home/jenkins/minikube-integration/22021-2487/.minikube/ca.pem, removing ...
	I1202 19:59:49.843263  202120 exec_runner.go:203] rm: /home/jenkins/minikube-integration/22021-2487/.minikube/ca.pem
	I1202 19:59:49.843287  202120 exec_runner.go:151] cp: /home/jenkins/minikube-integration/22021-2487/.minikube/certs/ca.pem --> /home/jenkins/minikube-integration/22021-2487/.minikube/ca.pem (1082 bytes)
	I1202 19:59:49.843350  202120 provision.go:117] generating server cert: /home/jenkins/minikube-integration/22021-2487/.minikube/machines/server.pem ca-key=/home/jenkins/minikube-integration/22021-2487/.minikube/certs/ca.pem private-key=/home/jenkins/minikube-integration/22021-2487/.minikube/certs/ca-key.pem org=jenkins.kubernetes-upgrade-685093 san=[127.0.0.1 192.168.76.2 kubernetes-upgrade-685093 localhost minikube]
	I1202 19:59:50.180041  202120 provision.go:177] copyRemoteCerts
	I1202 19:59:50.180113  202120 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I1202 19:59:50.180166  202120 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" kubernetes-upgrade-685093
	I1202 19:59:50.198141  202120 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33015 SSHKeyPath:/home/jenkins/minikube-integration/22021-2487/.minikube/machines/kubernetes-upgrade-685093/id_rsa Username:docker}
	I1202 19:59:50.304080  202120 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1082 bytes)
	I1202 19:59:50.322656  202120 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/machines/server.pem --> /etc/docker/server.pem (1241 bytes)
	I1202 19:59:50.340816  202120 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1675 bytes)
	I1202 19:59:50.359456  202120 provision.go:87] duration metric: took 534.5811ms to configureAuth
	I1202 19:59:50.359487  202120 ubuntu.go:206] setting minikube options for container-runtime
	I1202 19:59:50.359683  202120 config.go:182] Loaded profile config "kubernetes-upgrade-685093": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-beta.0
	I1202 19:59:50.359694  202120 machine.go:97] duration metric: took 4.070354872s to provisionDockerMachine
	I1202 19:59:50.359701  202120 start.go:293] postStartSetup for "kubernetes-upgrade-685093" (driver="docker")
	I1202 19:59:50.359713  202120 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I1202 19:59:50.359772  202120 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I1202 19:59:50.359823  202120 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" kubernetes-upgrade-685093
	I1202 19:59:50.377208  202120 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33015 SSHKeyPath:/home/jenkins/minikube-integration/22021-2487/.minikube/machines/kubernetes-upgrade-685093/id_rsa Username:docker}
	I1202 19:59:50.484358  202120 ssh_runner.go:195] Run: cat /etc/os-release
	I1202 19:59:50.488101  202120 main.go:143] libmachine: Couldn't set key VERSION_CODENAME, no corresponding struct field found
	I1202 19:59:50.488134  202120 info.go:137] Remote host: Debian GNU/Linux 12 (bookworm)
	I1202 19:59:50.488152  202120 filesync.go:126] Scanning /home/jenkins/minikube-integration/22021-2487/.minikube/addons for local assets ...
	I1202 19:59:50.488218  202120 filesync.go:126] Scanning /home/jenkins/minikube-integration/22021-2487/.minikube/files for local assets ...
	I1202 19:59:50.488300  202120 filesync.go:149] local asset: /home/jenkins/minikube-integration/22021-2487/.minikube/files/etc/ssl/certs/44352.pem -> 44352.pem in /etc/ssl/certs
	I1202 19:59:50.488456  202120 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs
	I1202 19:59:50.496258  202120 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/files/etc/ssl/certs/44352.pem --> /etc/ssl/certs/44352.pem (1708 bytes)
	I1202 19:59:50.514185  202120 start.go:296] duration metric: took 154.468523ms for postStartSetup
	I1202 19:59:50.514279  202120 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I1202 19:59:50.514342  202120 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" kubernetes-upgrade-685093
	I1202 19:59:50.531670  202120 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33015 SSHKeyPath:/home/jenkins/minikube-integration/22021-2487/.minikube/machines/kubernetes-upgrade-685093/id_rsa Username:docker}
	I1202 19:59:50.634246  202120 ssh_runner.go:195] Run: sh -c "df -BG /var | awk 'NR==2{print $4}'"
	I1202 19:59:50.639297  202120 fix.go:56] duration metric: took 4.7111223s for fixHost
	I1202 19:59:50.639320  202120 start.go:83] releasing machines lock for "kubernetes-upgrade-685093", held for 4.711172302s
	I1202 19:59:50.639388  202120 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" kubernetes-upgrade-685093
	I1202 19:59:50.662711  202120 ssh_runner.go:195] Run: cat /version.json
	I1202 19:59:50.662768  202120 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" kubernetes-upgrade-685093
	I1202 19:59:50.662944  202120 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I1202 19:59:50.663000  202120 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" kubernetes-upgrade-685093
	I1202 19:59:50.690657  202120 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33015 SSHKeyPath:/home/jenkins/minikube-integration/22021-2487/.minikube/machines/kubernetes-upgrade-685093/id_rsa Username:docker}
	I1202 19:59:50.700673  202120 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33015 SSHKeyPath:/home/jenkins/minikube-integration/22021-2487/.minikube/machines/kubernetes-upgrade-685093/id_rsa Username:docker}
	I1202 19:59:50.792121  202120 ssh_runner.go:195] Run: systemctl --version
	I1202 19:59:50.907888  202120 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	W1202 19:59:50.912535  202120 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I1202 19:59:50.912613  202120 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%p, " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I1202 19:59:50.923476  202120 cni.go:259] no active bridge cni configs found in "/etc/cni/net.d" - nothing to disable
	I1202 19:59:50.923514  202120 start.go:496] detecting cgroup driver to use...
	I1202 19:59:50.923547  202120 detect.go:187] detected "cgroupfs" cgroup driver on host os
	I1202 19:59:50.923609  202120 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I1202 19:59:50.941343  202120 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I1202 19:59:50.954608  202120 docker.go:218] disabling cri-docker service (if available) ...
	I1202 19:59:50.954681  202120 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.socket
	I1202 19:59:50.970096  202120 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.service
	I1202 19:59:50.983370  202120 ssh_runner.go:195] Run: sudo systemctl disable cri-docker.socket
	I1202 19:59:51.107547  202120 ssh_runner.go:195] Run: sudo systemctl mask cri-docker.service
	I1202 19:59:51.224112  202120 docker.go:234] disabling docker service ...
	I1202 19:59:51.224187  202120 ssh_runner.go:195] Run: sudo systemctl stop -f docker.socket
	I1202 19:59:51.239797  202120 ssh_runner.go:195] Run: sudo systemctl stop -f docker.service
	I1202 19:59:51.254185  202120 ssh_runner.go:195] Run: sudo systemctl disable docker.socket
	I1202 19:59:51.362407  202120 ssh_runner.go:195] Run: sudo systemctl mask docker.service
	I1202 19:59:51.475996  202120 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service docker
	I1202 19:59:51.490520  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I1202 19:59:51.505131  202120 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.10.1"|' /etc/containerd/config.toml"
	I1202 19:59:51.515642  202120 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I1202 19:59:51.525001  202120 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I1202 19:59:51.525125  202120 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I1202 19:59:51.533994  202120 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I1202 19:59:51.542965  202120 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I1202 19:59:51.551990  202120 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I1202 19:59:51.560789  202120 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I1202 19:59:51.568675  202120 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I1202 19:59:51.577662  202120 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I1202 19:59:51.586515  202120 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
	I1202 19:59:51.595722  202120 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I1202 19:59:51.603566  202120 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I1202 19:59:51.611073  202120 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1202 19:59:51.723993  202120 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I1202 19:59:51.869175  202120 start.go:543] Will wait 60s for socket path /run/containerd/containerd.sock
	I1202 19:59:51.869324  202120 ssh_runner.go:195] Run: stat /run/containerd/containerd.sock
	I1202 19:59:51.873224  202120 start.go:564] Will wait 60s for crictl version
	I1202 19:59:51.873290  202120 ssh_runner.go:195] Run: which crictl
	I1202 19:59:51.877293  202120 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl version
	I1202 19:59:51.914123  202120 start.go:580] Version:  0.1.0
	RuntimeName:  containerd
	RuntimeVersion:  v2.1.5
	RuntimeApiVersion:  v1
	I1202 19:59:51.914192  202120 ssh_runner.go:195] Run: containerd --version
	I1202 19:59:51.935055  202120 ssh_runner.go:195] Run: containerd --version
	I1202 19:59:51.962495  202120 out.go:179] * Preparing Kubernetes v1.35.0-beta.0 on containerd 2.1.5 ...
	I1202 19:59:51.965512  202120 cli_runner.go:164] Run: docker network inspect kubernetes-upgrade-685093 --format "{"Name": "{{.Name}}","Driver": "{{.Driver}}","Subnet": "{{range .IPAM.Config}}{{.Subnet}}{{end}}","Gateway": "{{range .IPAM.Config}}{{.Gateway}}{{end}}","MTU": {{if (index .Options "com.docker.network.driver.mtu")}}{{(index .Options "com.docker.network.driver.mtu")}}{{else}}0{{end}}, "ContainerIPs": [{{range $k,$v := .Containers }}"{{$v.IPv4Address}}",{{end}}]}"
	I1202 19:59:51.989353  202120 ssh_runner.go:195] Run: grep 192.168.76.1	host.minikube.internal$ /etc/hosts
	I1202 19:59:51.993196  202120 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\thost.minikube.internal$' "/etc/hosts"; echo "192.168.76.1	host.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I1202 19:59:52.003388  202120 kubeadm.go:884] updating cluster {Name:kubernetes-upgrade-685093 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b Memory:3072 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-beta.0 ClusterName:kubernetes-upgrade-685093 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.76.2 Port:8443 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[default-storageclass:true storage-provisioner:true] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} ...
	I1202 19:59:52.003498  202120 preload.go:188] Checking if preload exists for k8s version v1.35.0-beta.0 and runtime containerd
	I1202 19:59:52.003556  202120 ssh_runner.go:195] Run: sudo crictl images --output json
	I1202 19:59:52.029663  202120 containerd.go:623] couldn't find preloaded image for "registry.k8s.io/kube-apiserver:v1.35.0-beta.0". assuming images are not preloaded.
	I1202 19:59:52.029694  202120 cache_images.go:90] LoadCachedImages start: [registry.k8s.io/kube-apiserver:v1.35.0-beta.0 registry.k8s.io/kube-controller-manager:v1.35.0-beta.0 registry.k8s.io/kube-scheduler:v1.35.0-beta.0 registry.k8s.io/kube-proxy:v1.35.0-beta.0 registry.k8s.io/pause:3.10.1 registry.k8s.io/etcd:3.6.5-0 registry.k8s.io/coredns/coredns:v1.13.1 gcr.io/k8s-minikube/storage-provisioner:v5]
	I1202 19:59:52.029755  202120 image.go:138] retrieving image: gcr.io/k8s-minikube/storage-provisioner:v5
	I1202 19:59:52.030011  202120 image.go:138] retrieving image: registry.k8s.io/kube-apiserver:v1.35.0-beta.0
	I1202 19:59:52.030143  202120 image.go:138] retrieving image: registry.k8s.io/kube-controller-manager:v1.35.0-beta.0
	I1202 19:59:52.030231  202120 image.go:138] retrieving image: registry.k8s.io/kube-scheduler:v1.35.0-beta.0
	I1202 19:59:52.030318  202120 image.go:138] retrieving image: registry.k8s.io/kube-proxy:v1.35.0-beta.0
	I1202 19:59:52.030406  202120 image.go:138] retrieving image: registry.k8s.io/pause:3.10.1
	I1202 19:59:52.030508  202120 image.go:138] retrieving image: registry.k8s.io/etcd:3.6.5-0
	I1202 19:59:52.030594  202120 image.go:138] retrieving image: registry.k8s.io/coredns/coredns:v1.13.1
	I1202 19:59:52.032021  202120 image.go:181] daemon lookup for registry.k8s.io/coredns/coredns:v1.13.1: Error response from daemon: No such image: registry.k8s.io/coredns/coredns:v1.13.1
	I1202 19:59:52.032670  202120 image.go:181] daemon lookup for registry.k8s.io/pause:3.10.1: Error response from daemon: No such image: registry.k8s.io/pause:3.10.1
	I1202 19:59:52.032855  202120 image.go:181] daemon lookup for registry.k8s.io/kube-apiserver:v1.35.0-beta.0: Error response from daemon: No such image: registry.k8s.io/kube-apiserver:v1.35.0-beta.0
	I1202 19:59:52.032986  202120 image.go:181] daemon lookup for registry.k8s.io/kube-controller-manager:v1.35.0-beta.0: Error response from daemon: No such image: registry.k8s.io/kube-controller-manager:v1.35.0-beta.0
	I1202 19:59:52.033095  202120 image.go:181] daemon lookup for gcr.io/k8s-minikube/storage-provisioner:v5: Error response from daemon: No such image: gcr.io/k8s-minikube/storage-provisioner:v5
	I1202 19:59:52.033403  202120 image.go:181] daemon lookup for registry.k8s.io/kube-scheduler:v1.35.0-beta.0: Error response from daemon: No such image: registry.k8s.io/kube-scheduler:v1.35.0-beta.0
	I1202 19:59:52.033655  202120 image.go:181] daemon lookup for registry.k8s.io/kube-proxy:v1.35.0-beta.0: Error response from daemon: No such image: registry.k8s.io/kube-proxy:v1.35.0-beta.0
	I1202 19:59:52.034296  202120 image.go:181] daemon lookup for registry.k8s.io/etcd:3.6.5-0: Error response from daemon: No such image: registry.k8s.io/etcd:3.6.5-0
	I1202 19:59:52.411514  202120 containerd.go:267] Checking existence of image with name "registry.k8s.io/kube-proxy:v1.35.0-beta.0" and sha "404c2e12861777b763b8feaa316d36680fc68ad308a8d2f6e55f1bb981cdd904"
	I1202 19:59:52.411611  202120 ssh_runner.go:195] Run: sudo ctr -n=k8s.io images ls name==registry.k8s.io/kube-proxy:v1.35.0-beta.0
	I1202 19:59:52.423799  202120 containerd.go:267] Checking existence of image with name "registry.k8s.io/pause:3.10.1" and sha "d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd"
	I1202 19:59:52.423877  202120 ssh_runner.go:195] Run: sudo ctr -n=k8s.io images ls name==registry.k8s.io/pause:3.10.1
	I1202 19:59:52.424075  202120 containerd.go:267] Checking existence of image with name "registry.k8s.io/kube-scheduler:v1.35.0-beta.0" and sha "16378741539f1be9c6e347d127537d379a6592587b09b4eb47964cb5c43a409b"
	I1202 19:59:52.424154  202120 ssh_runner.go:195] Run: sudo ctr -n=k8s.io images ls name==registry.k8s.io/kube-scheduler:v1.35.0-beta.0
	I1202 19:59:52.431127  202120 containerd.go:267] Checking existence of image with name "registry.k8s.io/kube-apiserver:v1.35.0-beta.0" and sha "ccd634d9bcc36ac6235e9c86676cd3a02c06afc3788a25f1bbf39ca7d44585f4"
	I1202 19:59:52.431243  202120 ssh_runner.go:195] Run: sudo ctr -n=k8s.io images ls name==registry.k8s.io/kube-apiserver:v1.35.0-beta.0
	I1202 19:59:52.463521  202120 cache_images.go:118] "registry.k8s.io/kube-proxy:v1.35.0-beta.0" needs transfer: "registry.k8s.io/kube-proxy:v1.35.0-beta.0" does not exist at hash "404c2e12861777b763b8feaa316d36680fc68ad308a8d2f6e55f1bb981cdd904" in container runtime
	I1202 19:59:52.463564  202120 cri.go:218] Removing image: registry.k8s.io/kube-proxy:v1.35.0-beta.0
	I1202 19:59:52.463625  202120 ssh_runner.go:195] Run: which crictl
	I1202 19:59:52.472990  202120 cache_images.go:118] "registry.k8s.io/pause:3.10.1" needs transfer: "registry.k8s.io/pause:3.10.1" does not exist at hash "d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd" in container runtime
	I1202 19:59:52.473051  202120 cri.go:218] Removing image: registry.k8s.io/pause:3.10.1
	I1202 19:59:52.473102  202120 ssh_runner.go:195] Run: which crictl
	I1202 19:59:52.475638  202120 cache_images.go:118] "registry.k8s.io/kube-scheduler:v1.35.0-beta.0" needs transfer: "registry.k8s.io/kube-scheduler:v1.35.0-beta.0" does not exist at hash "16378741539f1be9c6e347d127537d379a6592587b09b4eb47964cb5c43a409b" in container runtime
	I1202 19:59:52.475680  202120 cri.go:218] Removing image: registry.k8s.io/kube-scheduler:v1.35.0-beta.0
	I1202 19:59:52.475739  202120 ssh_runner.go:195] Run: which crictl
	I1202 19:59:52.484818  202120 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/kube-proxy:v1.35.0-beta.0
	I1202 19:59:52.484907  202120 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/pause:3.10.1
	I1202 19:59:52.484711  202120 cache_images.go:118] "registry.k8s.io/kube-apiserver:v1.35.0-beta.0" needs transfer: "registry.k8s.io/kube-apiserver:v1.35.0-beta.0" does not exist at hash "ccd634d9bcc36ac6235e9c86676cd3a02c06afc3788a25f1bbf39ca7d44585f4" in container runtime
	I1202 19:59:52.484977  202120 cri.go:218] Removing image: registry.k8s.io/kube-apiserver:v1.35.0-beta.0
	I1202 19:59:52.485008  202120 ssh_runner.go:195] Run: which crictl
	I1202 19:59:52.485290  202120 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/kube-scheduler:v1.35.0-beta.0
	I1202 19:59:52.486708  202120 containerd.go:267] Checking existence of image with name "registry.k8s.io/kube-controller-manager:v1.35.0-beta.0" and sha "68b5f775f18769fcb77bd8474c80bda2050163b6c66f4551f352b7381b8ca5be"
	I1202 19:59:52.486801  202120 ssh_runner.go:195] Run: sudo ctr -n=k8s.io images ls name==registry.k8s.io/kube-controller-manager:v1.35.0-beta.0
	I1202 19:59:52.488133  202120 containerd.go:267] Checking existence of image with name "registry.k8s.io/coredns/coredns:v1.13.1" and sha "e08f4d9d2e6ede8185064c13b41f8eeee95b609c0ca93b6fe7509fe527c907cf"
	I1202 19:59:52.488230  202120 ssh_runner.go:195] Run: sudo ctr -n=k8s.io images ls name==registry.k8s.io/coredns/coredns:v1.13.1
	I1202 19:59:52.488552  202120 containerd.go:267] Checking existence of image with name "registry.k8s.io/etcd:3.6.5-0" and sha "2c5f0dedd21c25ec3a6709934d22152d53ec50fe57b72d29e4450655e3d14d42"
	I1202 19:59:52.488620  202120 ssh_runner.go:195] Run: sudo ctr -n=k8s.io images ls name==registry.k8s.io/etcd:3.6.5-0
	I1202 19:59:52.570585  202120 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/kube-apiserver:v1.35.0-beta.0
	I1202 19:59:52.570682  202120 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/kube-proxy:v1.35.0-beta.0
	I1202 19:59:52.570748  202120 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/pause:3.10.1
	I1202 19:59:52.570821  202120 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/kube-scheduler:v1.35.0-beta.0
	I1202 19:59:52.570877  202120 cache_images.go:118] "registry.k8s.io/kube-controller-manager:v1.35.0-beta.0" needs transfer: "registry.k8s.io/kube-controller-manager:v1.35.0-beta.0" does not exist at hash "68b5f775f18769fcb77bd8474c80bda2050163b6c66f4551f352b7381b8ca5be" in container runtime
	I1202 19:59:52.570901  202120 cri.go:218] Removing image: registry.k8s.io/kube-controller-manager:v1.35.0-beta.0
	I1202 19:59:52.570925  202120 ssh_runner.go:195] Run: which crictl
	I1202 19:59:52.570982  202120 cache_images.go:118] "registry.k8s.io/etcd:3.6.5-0" needs transfer: "registry.k8s.io/etcd:3.6.5-0" does not exist at hash "2c5f0dedd21c25ec3a6709934d22152d53ec50fe57b72d29e4450655e3d14d42" in container runtime
	I1202 19:59:52.570995  202120 cri.go:218] Removing image: registry.k8s.io/etcd:3.6.5-0
	I1202 19:59:52.571014  202120 ssh_runner.go:195] Run: which crictl
	I1202 19:59:52.571049  202120 cache_images.go:118] "registry.k8s.io/coredns/coredns:v1.13.1" needs transfer: "registry.k8s.io/coredns/coredns:v1.13.1" does not exist at hash "e08f4d9d2e6ede8185064c13b41f8eeee95b609c0ca93b6fe7509fe527c907cf" in container runtime
	I1202 19:59:52.571061  202120 cri.go:218] Removing image: registry.k8s.io/coredns/coredns:v1.13.1
	I1202 19:59:52.571080  202120 ssh_runner.go:195] Run: which crictl
	I1202 19:59:52.642492  202120 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/pause:3.10.1
	I1202 19:59:52.642575  202120 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/kube-apiserver:v1.35.0-beta.0
	I1202 19:59:52.642623  202120 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/kube-controller-manager:v1.35.0-beta.0
	I1202 19:59:52.642667  202120 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/kube-scheduler:v1.35.0-beta.0
	I1202 19:59:52.642583  202120 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/kube-proxy:v1.35.0-beta.0
	I1202 19:59:52.642740  202120 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/coredns/coredns:v1.13.1
	I1202 19:59:52.642718  202120 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/etcd:3.6.5-0
	I1202 19:59:52.748392  202120 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/etcd:3.6.5-0
	I1202 19:59:52.748425  202120 cache_images.go:291] Loading image from: /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/pause_3.10.1
	I1202 19:59:52.748526  202120 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/kube-controller-manager:v1.35.0-beta.0
	I1202 19:59:52.748602  202120 ssh_runner.go:195] Run: stat -c "%s %y" /var/lib/minikube/images/pause_3.10.1
	I1202 19:59:52.748637  202120 cache_images.go:291] Loading image from: /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/kube-proxy_v1.35.0-beta.0
	I1202 19:59:52.748614  202120 cache_images.go:291] Loading image from: /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/kube-scheduler_v1.35.0-beta.0
	I1202 19:59:52.748701  202120 ssh_runner.go:195] Run: stat -c "%s %y" /var/lib/minikube/images/kube-proxy_v1.35.0-beta.0
	I1202 19:59:52.748767  202120 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/coredns/coredns:v1.13.1
	I1202 19:59:52.748487  202120 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/kube-apiserver:v1.35.0-beta.0
	I1202 19:59:52.748771  202120 ssh_runner.go:195] Run: stat -c "%s %y" /var/lib/minikube/images/kube-scheduler_v1.35.0-beta.0
	I1202 19:59:52.823515  202120 ssh_runner.go:352] existence check for /var/lib/minikube/images/kube-scheduler_v1.35.0-beta.0: stat -c "%s %y" /var/lib/minikube/images/kube-scheduler_v1.35.0-beta.0: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/images/kube-scheduler_v1.35.0-beta.0': No such file or directory
	I1202 19:59:52.823574  202120 cache_images.go:291] Loading image from: /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/kube-apiserver_v1.35.0-beta.0
	I1202 19:59:52.823614  202120 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/kube-scheduler_v1.35.0-beta.0 --> /var/lib/minikube/images/kube-scheduler_v1.35.0-beta.0 (15401984 bytes)
	I1202 19:59:52.823689  202120 ssh_runner.go:195] Run: stat -c "%s %y" /var/lib/minikube/images/kube-apiserver_v1.35.0-beta.0
	I1202 19:59:52.823766  202120 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/coredns/coredns:v1.13.1
	I1202 19:59:52.823798  202120 ssh_runner.go:352] existence check for /var/lib/minikube/images/kube-proxy_v1.35.0-beta.0: stat -c "%s %y" /var/lib/minikube/images/kube-proxy_v1.35.0-beta.0: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/images/kube-proxy_v1.35.0-beta.0': No such file or directory
	I1202 19:59:52.823817  202120 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/kube-proxy_v1.35.0-beta.0 --> /var/lib/minikube/images/kube-proxy_v1.35.0-beta.0 (22432256 bytes)
	I1202 19:59:52.823884  202120 ssh_runner.go:352] existence check for /var/lib/minikube/images/pause_3.10.1: stat -c "%s %y" /var/lib/minikube/images/pause_3.10.1: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/images/pause_3.10.1': No such file or directory
	I1202 19:59:52.824043  202120 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/pause_3.10.1 --> /var/lib/minikube/images/pause_3.10.1 (268288 bytes)
	I1202 19:59:52.823972  202120 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/kube-controller-manager:v1.35.0-beta.0
	I1202 19:59:52.824017  202120 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi registry.k8s.io/etcd:3.6.5-0
	I1202 19:59:52.841805  202120 ssh_runner.go:352] existence check for /var/lib/minikube/images/kube-apiserver_v1.35.0-beta.0: stat -c "%s %y" /var/lib/minikube/images/kube-apiserver_v1.35.0-beta.0: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/images/kube-apiserver_v1.35.0-beta.0': No such file or directory
	I1202 19:59:52.841838  202120 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/kube-apiserver_v1.35.0-beta.0 --> /var/lib/minikube/images/kube-apiserver_v1.35.0-beta.0 (24689152 bytes)
	I1202 19:59:52.925856  202120 cache_images.go:291] Loading image from: /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/kube-controller-manager_v1.35.0-beta.0
	I1202 19:59:52.925969  202120 ssh_runner.go:195] Run: stat -c "%s %y" /var/lib/minikube/images/kube-controller-manager_v1.35.0-beta.0
	I1202 19:59:52.926025  202120 cache_images.go:291] Loading image from: /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/coredns/coredns_v1.13.1
	I1202 19:59:52.926076  202120 ssh_runner.go:195] Run: stat -c "%s %y" /var/lib/minikube/images/coredns_v1.13.1
	I1202 19:59:52.926259  202120 cache_images.go:291] Loading image from: /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/etcd_3.6.5-0
	I1202 19:59:52.926318  202120 ssh_runner.go:195] Run: stat -c "%s %y" /var/lib/minikube/images/etcd_3.6.5-0
	I1202 19:59:52.939695  202120 containerd.go:285] Loading image: /var/lib/minikube/images/pause_3.10.1
	I1202 19:59:52.939783  202120 ssh_runner.go:195] Run: sudo ctr -n=k8s.io images import /var/lib/minikube/images/pause_3.10.1
	I1202 19:59:53.022414  202120 ssh_runner.go:352] existence check for /var/lib/minikube/images/kube-controller-manager_v1.35.0-beta.0: stat -c "%s %y" /var/lib/minikube/images/kube-controller-manager_v1.35.0-beta.0: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/images/kube-controller-manager_v1.35.0-beta.0': No such file or directory
	I1202 19:59:53.022458  202120 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/kube-controller-manager_v1.35.0-beta.0 --> /var/lib/minikube/images/kube-controller-manager_v1.35.0-beta.0 (20672000 bytes)
	I1202 19:59:53.022519  202120 ssh_runner.go:352] existence check for /var/lib/minikube/images/coredns_v1.13.1: stat -c "%s %y" /var/lib/minikube/images/coredns_v1.13.1: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/images/coredns_v1.13.1': No such file or directory
	I1202 19:59:53.022544  202120 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/coredns/coredns_v1.13.1 --> /var/lib/minikube/images/coredns_v1.13.1 (21178368 bytes)
	I1202 19:59:53.022581  202120 ssh_runner.go:352] existence check for /var/lib/minikube/images/etcd_3.6.5-0: stat -c "%s %y" /var/lib/minikube/images/etcd_3.6.5-0: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/images/etcd_3.6.5-0': No such file or directory
	I1202 19:59:53.022597  202120 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/etcd_3.6.5-0 --> /var/lib/minikube/images/etcd_3.6.5-0 (21148160 bytes)
	I1202 19:59:53.200875  202120 cache_images.go:323] Transferred and loaded /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/pause_3.10.1 from cache
	W1202 19:59:53.411179  202120 image.go:286] image gcr.io/k8s-minikube/storage-provisioner:v5 arch mismatch: want arm64 got amd64. fixing
	I1202 19:59:53.411388  202120 containerd.go:267] Checking existence of image with name "gcr.io/k8s-minikube/storage-provisioner:v5" and sha "66749159455b3f08c8318fe0233122f54d0f5889f9c5fdfb73c3fd9d99895b51"
	I1202 19:59:53.411463  202120 ssh_runner.go:195] Run: sudo ctr -n=k8s.io images ls name==gcr.io/k8s-minikube/storage-provisioner:v5
	I1202 19:59:53.417079  202120 containerd.go:285] Loading image: /var/lib/minikube/images/kube-scheduler_v1.35.0-beta.0
	I1202 19:59:53.417174  202120 ssh_runner.go:195] Run: sudo ctr -n=k8s.io images import /var/lib/minikube/images/kube-scheduler_v1.35.0-beta.0
	I1202 19:59:53.494578  202120 cache_images.go:118] "gcr.io/k8s-minikube/storage-provisioner:v5" needs transfer: "gcr.io/k8s-minikube/storage-provisioner:v5" does not exist at hash "66749159455b3f08c8318fe0233122f54d0f5889f9c5fdfb73c3fd9d99895b51" in container runtime
	I1202 19:59:53.497405  202120 cri.go:218] Removing image: gcr.io/k8s-minikube/storage-provisioner:v5
	I1202 19:59:53.498452  202120 ssh_runner.go:195] Run: which crictl
	I1202 19:59:54.742374  202120 ssh_runner.go:235] Completed: sudo ctr -n=k8s.io images import /var/lib/minikube/images/kube-scheduler_v1.35.0-beta.0: (1.32517545s)
	I1202 19:59:54.742400  202120 cache_images.go:323] Transferred and loaded /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/kube-scheduler_v1.35.0-beta.0 from cache
	I1202 19:59:54.742416  202120 containerd.go:285] Loading image: /var/lib/minikube/images/etcd_3.6.5-0
	I1202 19:59:54.742489  202120 ssh_runner.go:195] Run: sudo ctr -n=k8s.io images import /var/lib/minikube/images/etcd_3.6.5-0
	I1202 19:59:54.742588  202120 ssh_runner.go:235] Completed: which crictl: (1.244118221s)
	I1202 19:59:54.742619  202120 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl rmi gcr.io/k8s-minikube/storage-provisioner:v5
	I1202 19:59:56.248613  202120 ssh_runner.go:235] Completed: sudo ctr -n=k8s.io images import /var/lib/minikube/images/etcd_3.6.5-0: (1.506096179s)
	I1202 19:59:56.248755  202120 cache_images.go:323] Transferred and loaded /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/etcd_3.6.5-0 from cache
	I1202 19:59:56.248780  202120 containerd.go:285] Loading image: /var/lib/minikube/images/kube-controller-manager_v1.35.0-beta.0
	I1202 19:59:56.248716  202120 ssh_runner.go:235] Completed: sudo /usr/local/bin/crictl rmi gcr.io/k8s-minikube/storage-provisioner:v5: (1.506065377s)
	I1202 19:59:56.248833  202120 ssh_runner.go:195] Run: sudo ctr -n=k8s.io images import /var/lib/minikube/images/kube-controller-manager_v1.35.0-beta.0
	I1202 19:59:56.248856  202120 cache_images.go:291] Loading image from: /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/gcr.io/k8s-minikube/storage-provisioner_v5
	I1202 19:59:56.248974  202120 ssh_runner.go:195] Run: stat -c "%s %y" /var/lib/minikube/images/storage-provisioner_v5
	I1202 19:59:57.214628  202120 cache_images.go:323] Transferred and loaded /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/kube-controller-manager_v1.35.0-beta.0 from cache
	I1202 19:59:57.214663  202120 containerd.go:285] Loading image: /var/lib/minikube/images/kube-proxy_v1.35.0-beta.0
	I1202 19:59:57.214714  202120 ssh_runner.go:195] Run: sudo ctr -n=k8s.io images import /var/lib/minikube/images/kube-proxy_v1.35.0-beta.0
	I1202 19:59:57.214806  202120 ssh_runner.go:352] existence check for /var/lib/minikube/images/storage-provisioner_v5: stat -c "%s %y" /var/lib/minikube/images/storage-provisioner_v5: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/images/storage-provisioner_v5': No such file or directory
	I1202 19:59:57.214829  202120 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/gcr.io/k8s-minikube/storage-provisioner_v5 --> /var/lib/minikube/images/storage-provisioner_v5 (8035840 bytes)
	I1202 19:59:58.297883  202120 ssh_runner.go:235] Completed: sudo ctr -n=k8s.io images import /var/lib/minikube/images/kube-proxy_v1.35.0-beta.0: (1.083144109s)
	I1202 19:59:58.297955  202120 cache_images.go:323] Transferred and loaded /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/kube-proxy_v1.35.0-beta.0 from cache
	I1202 19:59:58.298004  202120 containerd.go:285] Loading image: /var/lib/minikube/images/coredns_v1.13.1
	I1202 19:59:58.298085  202120 ssh_runner.go:195] Run: sudo ctr -n=k8s.io images import /var/lib/minikube/images/coredns_v1.13.1
	I1202 19:59:59.598413  202120 ssh_runner.go:235] Completed: sudo ctr -n=k8s.io images import /var/lib/minikube/images/coredns_v1.13.1: (1.300285429s)
	I1202 19:59:59.598443  202120 cache_images.go:323] Transferred and loaded /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/coredns/coredns_v1.13.1 from cache
	I1202 19:59:59.598472  202120 containerd.go:285] Loading image: /var/lib/minikube/images/kube-apiserver_v1.35.0-beta.0
	I1202 19:59:59.598522  202120 ssh_runner.go:195] Run: sudo ctr -n=k8s.io images import /var/lib/minikube/images/kube-apiserver_v1.35.0-beta.0
	I1202 20:00:02.048680  202120 ssh_runner.go:235] Completed: sudo ctr -n=k8s.io images import /var/lib/minikube/images/kube-apiserver_v1.35.0-beta.0: (2.450125614s)
	I1202 20:00:02.048708  202120 cache_images.go:323] Transferred and loaded /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/registry.k8s.io/kube-apiserver_v1.35.0-beta.0 from cache
	I1202 20:00:02.048728  202120 containerd.go:285] Loading image: /var/lib/minikube/images/storage-provisioner_v5
	I1202 20:00:02.048847  202120 ssh_runner.go:195] Run: sudo ctr -n=k8s.io images import /var/lib/minikube/images/storage-provisioner_v5
	I1202 20:00:02.682457  202120 cache_images.go:323] Transferred and loaded /home/jenkins/minikube-integration/22021-2487/.minikube/cache/images/arm64/gcr.io/k8s-minikube/storage-provisioner_v5 from cache
	I1202 20:00:02.682494  202120 cache_images.go:125] Successfully loaded all cached images
	I1202 20:00:02.682501  202120 cache_images.go:94] duration metric: took 10.652792005s to LoadCachedImages
	I1202 20:00:02.682514  202120 kubeadm.go:935] updating node { 192.168.76.2 8443 v1.35.0-beta.0 containerd true true} ...
	I1202 20:00:02.682623  202120 kubeadm.go:947] kubelet [Unit]
	Wants=containerd.service
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.35.0-beta.0/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=kubernetes-upgrade-685093 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.168.76.2
	
	[Install]
	 config:
	{KubernetesVersion:v1.35.0-beta.0 ClusterName:kubernetes-upgrade-685093 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I1202 20:00:02.682741  202120 ssh_runner.go:195] Run: sudo crictl info
	I1202 20:00:02.713114  202120 cni.go:84] Creating CNI manager for ""
	I1202 20:00:02.713144  202120 cni.go:143] "docker" driver + "containerd" runtime found, recommending kindnet
	I1202 20:00:02.713165  202120 kubeadm.go:85] Using pod CIDR: 10.244.0.0/16
	I1202 20:00:02.713190  202120 kubeadm.go:190] kubeadm options: {CertDir:/var/lib/minikube/certs ServiceCIDR:10.96.0.0/12 PodSubnet:10.244.0.0/16 AdvertiseAddress:192.168.76.2 APIServerPort:8443 KubernetesVersion:v1.35.0-beta.0 EtcdDataDir:/var/lib/minikube/etcd EtcdExtraArgs:map[] ClusterName:kubernetes-upgrade-685093 NodeName:kubernetes-upgrade-685093 DNSDomain:cluster.local CRISocket:/run/containerd/containerd.sock ImageRepository: ComponentOptions:[{Component:apiServer ExtraArgs:map[enable-admission-plugins:NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota] Pairs:map[certSANs:["127.0.0.1", "localhost", "192.168.76.2"]]} {Component:controllerManager ExtraArgs:map[allocate-node-cidrs:true leader-elect:false] Pairs:map[]} {Component:scheduler ExtraArgs:map[leader-elect:false] Pairs:map[]}] FeatureArgs:map[] NodeIP:192.168.76.2 CgroupDriver:cgroupfs ClientCAFile:/var/lib/minikube/
certs/ca.crt StaticPodPath:/etc/kubernetes/manifests ControlPlaneAddress:control-plane.minikube.internal KubeProxyOptions:map[] ResolvConfSearchRegression:false KubeletConfigOpts:map[containerRuntimeEndpoint:unix:///run/containerd/containerd.sock hairpinMode:hairpin-veth runtimeRequestTimeout:15m] PrependCriSocketUnix:true}
	I1202 20:00:02.713320  202120 kubeadm.go:196] kubeadm config:
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: InitConfiguration
	localAPIEndpoint:
	  advertiseAddress: 192.168.76.2
	  bindPort: 8443
	bootstrapTokens:
	  - groups:
	      - system:bootstrappers:kubeadm:default-node-token
	    ttl: 24h0m0s
	    usages:
	      - signing
	      - authentication
	nodeRegistration:
	  criSocket: unix:///run/containerd/containerd.sock
	  name: "kubernetes-upgrade-685093"
	  kubeletExtraArgs:
	    - name: "node-ip"
	      value: "192.168.76.2"
	  taints: []
	---
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: ClusterConfiguration
	apiServer:
	  certSANs: ["127.0.0.1", "localhost", "192.168.76.2"]
	  extraArgs:
	    - name: "enable-admission-plugins"
	      value: "NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota"
	controllerManager:
	  extraArgs:
	    - name: "allocate-node-cidrs"
	      value: "true"
	    - name: "leader-elect"
	      value: "false"
	scheduler:
	  extraArgs:
	    - name: "leader-elect"
	      value: "false"
	certificatesDir: /var/lib/minikube/certs
	clusterName: mk
	controlPlaneEndpoint: control-plane.minikube.internal:8443
	etcd:
	  local:
	    dataDir: /var/lib/minikube/etcd
	kubernetesVersion: v1.35.0-beta.0
	networking:
	  dnsDomain: cluster.local
	  podSubnet: "10.244.0.0/16"
	  serviceSubnet: 10.96.0.0/12
	---
	apiVersion: kubelet.config.k8s.io/v1beta1
	kind: KubeletConfiguration
	authentication:
	  x509:
	    clientCAFile: /var/lib/minikube/certs/ca.crt
	cgroupDriver: cgroupfs
	containerRuntimeEndpoint: unix:///run/containerd/containerd.sock
	hairpinMode: hairpin-veth
	runtimeRequestTimeout: 15m
	clusterDomain: "cluster.local"
	# disable disk resource management by default
	imageGCHighThresholdPercent: 100
	evictionHard:
	  nodefs.available: "0%"
	  nodefs.inodesFree: "0%"
	  imagefs.available: "0%"
	failSwapOn: false
	staticPodPath: /etc/kubernetes/manifests
	---
	apiVersion: kubeproxy.config.k8s.io/v1alpha1
	kind: KubeProxyConfiguration
	clusterCIDR: "10.244.0.0/16"
	metricsBindAddress: 0.0.0.0:10249
	conntrack:
	  maxPerCore: 0
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_established"
	  tcpEstablishedTimeout: 0s
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_close"
	  tcpCloseWaitTimeout: 0s
	
	I1202 20:00:02.713407  202120 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.35.0-beta.0
	I1202 20:00:02.723341  202120 binaries.go:54] Didn't find k8s binaries: sudo ls /var/lib/minikube/binaries/v1.35.0-beta.0: Process exited with status 2
	stdout:
	
	stderr:
	ls: cannot access '/var/lib/minikube/binaries/v1.35.0-beta.0': No such file or directory
	
	Initiating transfer...
	I1202 20:00:02.723446  202120 ssh_runner.go:195] Run: sudo mkdir -p /var/lib/minikube/binaries/v1.35.0-beta.0
	I1202 20:00:02.732599  202120 binary.go:80] Not caching binary, using https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubectl?checksum=file:https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubectl.sha256
	I1202 20:00:02.732710  202120 ssh_runner.go:195] Run: stat -c "%s %y" /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl
	I1202 20:00:02.732818  202120 binary.go:80] Not caching binary, using https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubelet?checksum=file:https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubelet.sha256
	I1202 20:00:02.732862  202120 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I1202 20:00:02.732956  202120 binary.go:80] Not caching binary, using https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm?checksum=file:https://dl.k8s.io/release/v1.35.0-beta.0/bin/linux/arm64/kubeadm.sha256
	I1202 20:00:02.733015  202120 ssh_runner.go:195] Run: stat -c "%s %y" /var/lib/minikube/binaries/v1.35.0-beta.0/kubeadm
	I1202 20:00:02.752156  202120 ssh_runner.go:352] existence check for /var/lib/minikube/binaries/v1.35.0-beta.0/kubeadm: stat -c "%s %y" /var/lib/minikube/binaries/v1.35.0-beta.0/kubeadm: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/binaries/v1.35.0-beta.0/kubeadm': No such file or directory
	I1202 20:00:02.752201  202120 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/cache/linux/arm64/v1.35.0-beta.0/kubeadm --> /var/lib/minikube/binaries/v1.35.0-beta.0/kubeadm (68354232 bytes)
	I1202 20:00:02.752248  202120 ssh_runner.go:352] existence check for /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl: stat -c "%s %y" /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/binaries/v1.35.0-beta.0/kubectl': No such file or directory
	I1202 20:00:02.752282  202120 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/cache/linux/arm64/v1.35.0-beta.0/kubectl --> /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl (55181496 bytes)
	I1202 20:00:02.752404  202120 ssh_runner.go:195] Run: stat -c "%s %y" /var/lib/minikube/binaries/v1.35.0-beta.0/kubelet
	I1202 20:00:02.759842  202120 ssh_runner.go:352] existence check for /var/lib/minikube/binaries/v1.35.0-beta.0/kubelet: stat -c "%s %y" /var/lib/minikube/binaries/v1.35.0-beta.0/kubelet: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/binaries/v1.35.0-beta.0/kubelet': No such file or directory
	I1202 20:00:02.759980  202120 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/cache/linux/arm64/v1.35.0-beta.0/kubelet --> /var/lib/minikube/binaries/v1.35.0-beta.0/kubelet (54329636 bytes)
	I1202 20:00:03.945961  202120 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /var/tmp/minikube
	I1202 20:00:03.956987  202120 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (336 bytes)
	I1202 20:00:03.974273  202120 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (359 bytes)
	I1202 20:00:03.990596  202120 ssh_runner.go:362] scp memory --> /var/tmp/minikube/kubeadm.yaml.new (2245 bytes)
	I1202 20:00:04.008252  202120 ssh_runner.go:195] Run: grep 192.168.76.2	control-plane.minikube.internal$ /etc/hosts
	I1202 20:00:04.014484  202120 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\tcontrol-plane.minikube.internal$' "/etc/hosts"; echo "192.168.76.2	control-plane.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
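The one-liner above is an idempotent hosts-entry update: strip any existing `control-plane.minikube.internal` line, append the current mapping, and replace the file via a temp copy (the log uses `sudo cp` because the target is `/etc/hosts`). A sketch against a throwaway hosts file, no sudo required:

```shell
# Idempotent hosts update: remove any stale entry for $name, append
# the fresh one, then swap the file in via a temp copy.
hosts=$(mktemp)
printf '127.0.0.1\tlocalhost\n' > "$hosts"
tab=$(printf '\t')

add_host() {
  ip=$1; name=$2
  { grep -v "${tab}${name}\$" "$hosts"; printf '%s\t%s\n' "$ip" "$name"; } > "$hosts.$$"
  cp "$hosts.$$" "$hosts" && rm -f "$hosts.$$"
}

add_host 192.168.76.2 control-plane.minikube.internal
add_host 192.168.76.2 control-plane.minikube.internal   # safe to repeat
```

Running it twice leaves exactly one entry, which is why minikube can re-run this on every start.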
	I1202 20:00:04.028731  202120 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1202 20:00:04.156579  202120 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I1202 20:00:04.176444  202120 certs.go:69] Setting up /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/kubernetes-upgrade-685093 for IP: 192.168.76.2
	I1202 20:00:04.176509  202120 certs.go:195] generating shared ca certs ...
	I1202 20:00:04.176544  202120 certs.go:227] acquiring lock for ca certs: {Name:mk2ce7651a779b9fbf8eac798f9ac184328de0c6 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1202 20:00:04.176749  202120 certs.go:236] skipping valid "minikubeCA" ca cert: /home/jenkins/minikube-integration/22021-2487/.minikube/ca.key
	I1202 20:00:04.176821  202120 certs.go:236] skipping valid "proxyClientCA" ca cert: /home/jenkins/minikube-integration/22021-2487/.minikube/proxy-client-ca.key
	I1202 20:00:04.176866  202120 certs.go:257] generating profile certs ...
	I1202 20:00:04.176994  202120 certs.go:360] skipping valid signed profile cert regeneration for "minikube-user": /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/kubernetes-upgrade-685093/client.key
	I1202 20:00:04.177127  202120 certs.go:360] skipping valid signed profile cert regeneration for "minikube": /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/kubernetes-upgrade-685093/apiserver.key.cae7297e
	I1202 20:00:04.177209  202120 certs.go:360] skipping valid signed profile cert regeneration for "aggregator": /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/kubernetes-upgrade-685093/proxy-client.key
	I1202 20:00:04.177394  202120 certs.go:484] found cert: /home/jenkins/minikube-integration/22021-2487/.minikube/certs/4435.pem (1338 bytes)
	W1202 20:00:04.177461  202120 certs.go:480] ignoring /home/jenkins/minikube-integration/22021-2487/.minikube/certs/4435_empty.pem, impossibly tiny 0 bytes
	I1202 20:00:04.177485  202120 certs.go:484] found cert: /home/jenkins/minikube-integration/22021-2487/.minikube/certs/ca-key.pem (1679 bytes)
	I1202 20:00:04.177546  202120 certs.go:484] found cert: /home/jenkins/minikube-integration/22021-2487/.minikube/certs/ca.pem (1082 bytes)
	I1202 20:00:04.177604  202120 certs.go:484] found cert: /home/jenkins/minikube-integration/22021-2487/.minikube/certs/cert.pem (1123 bytes)
	I1202 20:00:04.177669  202120 certs.go:484] found cert: /home/jenkins/minikube-integration/22021-2487/.minikube/certs/key.pem (1675 bytes)
	I1202 20:00:04.177744  202120 certs.go:484] found cert: /home/jenkins/minikube-integration/22021-2487/.minikube/files/etc/ssl/certs/44352.pem (1708 bytes)
	I1202 20:00:04.178368  202120 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I1202 20:00:04.218531  202120 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1675 bytes)
	I1202 20:00:04.247917  202120 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I1202 20:00:04.267904  202120 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1675 bytes)
	I1202 20:00:04.288964  202120 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/kubernetes-upgrade-685093/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1436 bytes)
	I1202 20:00:04.312473  202120 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/kubernetes-upgrade-685093/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1679 bytes)
	I1202 20:00:04.336370  202120 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/kubernetes-upgrade-685093/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I1202 20:00:04.357704  202120 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/kubernetes-upgrade-685093/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1679 bytes)
	I1202 20:00:04.381224  202120 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/certs/4435.pem --> /usr/share/ca-certificates/4435.pem (1338 bytes)
	I1202 20:00:04.401167  202120 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/files/etc/ssl/certs/44352.pem --> /usr/share/ca-certificates/44352.pem (1708 bytes)
	I1202 20:00:04.422967  202120 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I1202 20:00:04.445568  202120 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (738 bytes)
	I1202 20:00:04.460438  202120 ssh_runner.go:195] Run: openssl version
	I1202 20:00:04.467843  202120 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I1202 20:00:04.478457  202120 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I1202 20:00:04.482732  202120 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Dec  2 18:48 /usr/share/ca-certificates/minikubeCA.pem
	I1202 20:00:04.482891  202120 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I1202 20:00:04.527412  202120 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
	I1202 20:00:04.536167  202120 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/4435.pem && ln -fs /usr/share/ca-certificates/4435.pem /etc/ssl/certs/4435.pem"
	I1202 20:00:04.545894  202120 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/4435.pem
	I1202 20:00:04.550213  202120 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Dec  2 18:58 /usr/share/ca-certificates/4435.pem
	I1202 20:00:04.550329  202120 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/4435.pem
	I1202 20:00:04.592303  202120 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/51391683.0 || ln -fs /etc/ssl/certs/4435.pem /etc/ssl/certs/51391683.0"
	I1202 20:00:04.600643  202120 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/44352.pem && ln -fs /usr/share/ca-certificates/44352.pem /etc/ssl/certs/44352.pem"
	I1202 20:00:04.609546  202120 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/44352.pem
	I1202 20:00:04.613665  202120 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Dec  2 18:58 /usr/share/ca-certificates/44352.pem
	I1202 20:00:04.613727  202120 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/44352.pem
	I1202 20:00:04.655849  202120 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/3ec20f2e.0 || ln -fs /etc/ssl/certs/44352.pem /etc/ssl/certs/3ec20f2e.0"
	I1202 20:00:04.664480  202120 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I1202 20:00:04.669830  202120 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-etcd-client.crt -checkend 86400
	I1202 20:00:04.712316  202120 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-kubelet-client.crt -checkend 86400
	I1202 20:00:04.755899  202120 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/server.crt -checkend 86400
	I1202 20:00:04.797682  202120 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/healthcheck-client.crt -checkend 86400
	I1202 20:00:04.840905  202120 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/peer.crt -checkend 86400
	I1202 20:00:04.905455  202120 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/front-proxy-client.crt -checkend 86400
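Two OpenSSL idioms appear in the runs above: `x509 -hash` prints the subject hash that names the `/etc/ssl/certs/<hash>.0` symlink OpenSSL's CA lookup expects, and `x509 -checkend 86400` exits non-zero if the certificate expires within the next 24 hours. A self-contained sketch with a throwaway self-signed cert (assumes the `openssl` CLI is installed):

```shell
dir=$(mktemp -d)
# Throwaway self-signed cert standing in for minikubeCA.pem.
openssl req -x509 -newkey rsa:2048 -nodes -days 30 \
  -subj /CN=sketchCA -keyout "$dir/ca.key" -out "$dir/ca.pem" 2>/dev/null

# Subject-hash symlink, as in the "ln -fs ... /etc/ssl/certs/<hash>.0" runs.
hash=$(openssl x509 -hash -noout -in "$dir/ca.pem")
ln -fs "$dir/ca.pem" "$dir/$hash.0"

# Expiry check: exit 0 means the cert is still valid 86400s from now.
if openssl x509 -noout -in "$dir/ca.pem" -checkend 86400; then
  status=valid
else
  status=expiring
fi
```

The `test -L ... || ln -fs ...` wrapper in the log just makes the symlink creation idempotent across restarts.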
	I1202 20:00:04.951883  202120 kubeadm.go:401] StartCluster: {Name:kubernetes-upgrade-685093 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b Memory:3072 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-beta.0 ClusterName:kubernetes-upgrade-685093 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.76.2 Port:8443 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[default-storageclass:true storage-provisioner:true] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1202 20:00:04.952006  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:paused Name: Namespaces:[kube-system]}
	I1202 20:00:04.952084  202120 ssh_runner.go:195] Run: sudo -s eval "crictl ps -a --quiet --label io.kubernetes.pod.namespace=kube-system"
	I1202 20:00:04.985584  202120 cri.go:89] found id: "18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92"
	I1202 20:00:04.985606  202120 cri.go:89] found id: "86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992"
	I1202 20:00:04.985611  202120 cri.go:89] found id: "80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3"
	I1202 20:00:04.985614  202120 cri.go:89] found id: "f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53"
	I1202 20:00:04.985618  202120 cri.go:89] found id: ""
	I1202 20:00:04.985679  202120 ssh_runner.go:195] Run: sudo runc --root /run/containerd/runc/k8s.io list -f json
	W1202 20:00:05.021301  202120 kubeadm.go:408] unpause failed: list paused: runc: sudo runc --root /run/containerd/runc/k8s.io list -f json: Process exited with status 1
	stdout:
	
	stderr:
	time="2025-12-02T20:00:05Z" level=error msg="open /run/containerd/runc/k8s.io: no such file or directory"
	I1202 20:00:05.021458  202120 ssh_runner.go:195] Run: sudo ls /var/lib/kubelet/kubeadm-flags.env /var/lib/kubelet/config.yaml /var/lib/minikube/etcd
	I1202 20:00:05.031799  202120 kubeadm.go:417] found existing configuration files, will attempt cluster restart
	I1202 20:00:05.031820  202120 kubeadm.go:598] restartPrimaryControlPlane start ...
	I1202 20:00:05.031875  202120 ssh_runner.go:195] Run: sudo test -d /data/minikube
	I1202 20:00:05.041829  202120 kubeadm.go:131] /data/minikube skipping compat symlinks: sudo test -d /data/minikube: Process exited with status 1
	stdout:
	
	stderr:
	I1202 20:00:05.042393  202120 kubeconfig.go:47] verify endpoint returned: get endpoint: "kubernetes-upgrade-685093" does not appear in /home/jenkins/minikube-integration/22021-2487/kubeconfig
	I1202 20:00:05.042679  202120 kubeconfig.go:62] /home/jenkins/minikube-integration/22021-2487/kubeconfig needs updating (will repair): [kubeconfig missing "kubernetes-upgrade-685093" cluster setting kubeconfig missing "kubernetes-upgrade-685093" context setting]
	I1202 20:00:05.043172  202120 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/22021-2487/kubeconfig: {Name:mka13b3f16f6b2d645ade32cb83bfcf203300413 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1202 20:00:05.043898  202120 kapi.go:59] client config for kubernetes-upgrade-685093: &rest.Config{Host:"https://192.168.76.2:8443", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/home/jenkins/minikube-integration/22021-2487/.minikube/profiles/kubernetes-upgrade-685093/client.crt", KeyFile:"/home/jenkins/minikube-integration/22021-2487/.minikube/profiles/kubernetes-upgrade-685093/client.key", CAFile:"/home/jenkins/minikube-integration/22021-2487/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x1fb33c0), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), WarningHandlerWithContext:rest.WarningHandlerWithContext(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	I1202 20:00:05.044498  202120 envvar.go:172] "Feature gate default state" feature="ClientsAllowCBOR" enabled=false
	I1202 20:00:05.044529  202120 envvar.go:172] "Feature gate default state" feature="ClientsPreferCBOR" enabled=false
	I1202 20:00:05.044535  202120 envvar.go:172] "Feature gate default state" feature="InformerResourceVersion" enabled=false
	I1202 20:00:05.044544  202120 envvar.go:172] "Feature gate default state" feature="InOrderInformers" enabled=true
	I1202 20:00:05.044557  202120 envvar.go:172] "Feature gate default state" feature="WatchListClient" enabled=false
	I1202 20:00:05.044822  202120 ssh_runner.go:195] Run: sudo diff -u /var/tmp/minikube/kubeadm.yaml /var/tmp/minikube/kubeadm.yaml.new
	I1202 20:00:05.054971  202120 kubeadm.go:645] detected kubeadm config drift (will reconfigure cluster from new /var/tmp/minikube/kubeadm.yaml):
	-- stdout --
	--- /var/tmp/minikube/kubeadm.yaml	2025-12-02 19:59:22.285490681 +0000
	+++ /var/tmp/minikube/kubeadm.yaml.new	2025-12-02 20:00:04.002188077 +0000
	@@ -1,4 +1,4 @@
	-apiVersion: kubeadm.k8s.io/v1beta3
	+apiVersion: kubeadm.k8s.io/v1beta4
	 kind: InitConfiguration
	 localAPIEndpoint:
	   advertiseAddress: 192.168.76.2
	@@ -14,31 +14,34 @@
	   criSocket: unix:///run/containerd/containerd.sock
	   name: "kubernetes-upgrade-685093"
	   kubeletExtraArgs:
	-    node-ip: 192.168.76.2
	+    - name: "node-ip"
	+      value: "192.168.76.2"
	   taints: []
	 ---
	-apiVersion: kubeadm.k8s.io/v1beta3
	+apiVersion: kubeadm.k8s.io/v1beta4
	 kind: ClusterConfiguration
	 apiServer:
	   certSANs: ["127.0.0.1", "localhost", "192.168.76.2"]
	   extraArgs:
	-    enable-admission-plugins: "NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota"
	+    - name: "enable-admission-plugins"
	+      value: "NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota"
	 controllerManager:
	   extraArgs:
	-    allocate-node-cidrs: "true"
	-    leader-elect: "false"
	+    - name: "allocate-node-cidrs"
	+      value: "true"
	+    - name: "leader-elect"
	+      value: "false"
	 scheduler:
	   extraArgs:
	-    leader-elect: "false"
	+    - name: "leader-elect"
	+      value: "false"
	 certificatesDir: /var/lib/minikube/certs
	 clusterName: mk
	 controlPlaneEndpoint: control-plane.minikube.internal:8443
	 etcd:
	   local:
	     dataDir: /var/lib/minikube/etcd
	-    extraArgs:
	-      proxy-refresh-interval: "70000"
	-kubernetesVersion: v1.28.0
	+kubernetesVersion: v1.35.0-beta.0
	 networking:
	   dnsDomain: cluster.local
	   podSubnet: "10.244.0.0/16"
	
	-- /stdout --
	I1202 20:00:05.054991  202120 kubeadm.go:1161] stopping kube-system containers ...
	I1202 20:00:05.055004  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name: Namespaces:[kube-system]}
	I1202 20:00:05.055103  202120 ssh_runner.go:195] Run: sudo -s eval "crictl ps -a --quiet --label io.kubernetes.pod.namespace=kube-system"
	I1202 20:00:05.083336  202120 cri.go:89] found id: "18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92"
	I1202 20:00:05.083360  202120 cri.go:89] found id: "86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992"
	I1202 20:00:05.083372  202120 cri.go:89] found id: "80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3"
	I1202 20:00:05.083384  202120 cri.go:89] found id: "f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53"
	I1202 20:00:05.083389  202120 cri.go:89] found id: ""
	I1202 20:00:05.083395  202120 cri.go:252] Stopping containers: [18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92 86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992 80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3 f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53]
	I1202 20:00:05.083475  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:00:05.087671  202120 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl stop --timeout=10 18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92 86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992 80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3 f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53
	I1202 20:00:05.121631  202120 ssh_runner.go:195] Run: sudo systemctl stop kubelet
	I1202 20:00:05.137355  202120 ssh_runner.go:195] Run: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf
	I1202 20:00:05.145994  202120 kubeadm.go:158] found existing configuration files:
	-rw------- 1 root root 5639 Dec  2 19:59 /etc/kubernetes/admin.conf
	-rw------- 1 root root 5652 Dec  2 19:59 /etc/kubernetes/controller-manager.conf
	-rw------- 1 root root 2039 Dec  2 19:59 /etc/kubernetes/kubelet.conf
	-rw------- 1 root root 5604 Dec  2 19:59 /etc/kubernetes/scheduler.conf
	
	I1202 20:00:05.146097  202120 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/admin.conf
	I1202 20:00:05.154688  202120 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/kubelet.conf
	I1202 20:00:05.163730  202120 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/controller-manager.conf
	I1202 20:00:05.172417  202120 kubeadm.go:164] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/controller-manager.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/controller-manager.conf: Process exited with status 1
	stdout:
	
	stderr:
	I1202 20:00:05.172493  202120 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/controller-manager.conf
	I1202 20:00:05.181515  202120 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/scheduler.conf
	I1202 20:00:05.190709  202120 kubeadm.go:164] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/scheduler.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/scheduler.conf: Process exited with status 1
	stdout:
	
	stderr:
	I1202 20:00:05.190814  202120 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/scheduler.conf
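The grep/rm pairs above implement "keep the kubeconfig only if it points at the expected endpoint": when `grep` for `https://control-plane.minikube.internal:8443` exits non-zero, the file is deleted so the subsequent `kubeadm init phase kubeconfig` run regenerates it. Sketched locally with hypothetical config files:

```shell
# Validate-or-remove: keep each config only if it references the
# expected control-plane endpoint; otherwise delete it so it gets
# regenerated from scratch.
dir=$(mktemp -d)
printf 'server: https://control-plane.minikube.internal:8443\n' > "$dir/admin.conf"
printf 'server: https://192.168.76.2:8443\n' > "$dir/scheduler.conf"

endpoint=https://control-plane.minikube.internal:8443
for f in admin.conf scheduler.conf; do
  if grep -q "$endpoint" "$dir/$f"; then
    echo "keeping $f"
  else
    rm -f "$dir/$f"   # stale endpoint: let kubeadm rewrite it
  fi
done
```

This matches the trace: `admin.conf` and `kubelet.conf` pass the grep, while `controller-manager.conf` and `scheduler.conf` fail it and are removed.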
	I1202 20:00:05.199316  202120 ssh_runner.go:195] Run: sudo cp /var/tmp/minikube/kubeadm.yaml.new /var/tmp/minikube/kubeadm.yaml
	I1202 20:00:05.207861  202120 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init phase certs all --config /var/tmp/minikube/kubeadm.yaml"
	I1202 20:00:05.260680  202120 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init phase kubeconfig all --config /var/tmp/minikube/kubeadm.yaml"
	I1202 20:00:06.312029  202120 ssh_runner.go:235] Completed: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init phase kubeconfig all --config /var/tmp/minikube/kubeadm.yaml": (1.051311731s)
	I1202 20:00:06.312102  202120 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init phase kubelet-start --config /var/tmp/minikube/kubeadm.yaml"
	I1202 20:00:06.529042  202120 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init phase control-plane all --config /var/tmp/minikube/kubeadm.yaml"
	I1202 20:00:06.586620  202120 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init phase etcd local --config /var/tmp/minikube/kubeadm.yaml"
	I1202 20:00:06.637014  202120 api_server.go:52] waiting for apiserver process to appear ...
	I1202 20:00:06.637124  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:00:07.137186  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:00:07.637336  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:00:08.137319  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:00:08.637913  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:00:09.137614  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:00:09.637404  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:00:10.138098  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:00:10.637315  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:00:11.137505  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:00:11.637576  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:00:12.137297  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:00:12.637286  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:00:13.137176  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:00:13.637374  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:00:14.137171  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:00:14.638210  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:00:15.137846  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:00:15.637848  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:00:16.137910  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:00:16.637311  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:00:17.137205  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:00:17.637917  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:00:18.138086  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:00:18.638033  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:00:19.137249  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:00:19.637207  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:00:20.137759  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:00:20.637594  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:00:21.137478  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:00:21.637432  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:00:22.137544  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:00:22.637382  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:00:23.138243  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:00:23.637550  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:00:24.137388  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:00:24.637293  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:00:25.137470  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:00:25.638107  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:00:26.138015  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:00:26.639098  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:00:27.137542  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:00:27.637595  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:00:28.137430  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:00:28.638098  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:00:29.137554  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:00:29.637975  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:00:30.138122  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:00:30.638147  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:00:31.137271  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:00:31.637160  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:00:32.137182  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:00:32.637393  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:00:33.137498  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:00:33.637226  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:00:34.137520  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:00:34.637174  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:00:35.137849  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:00:35.637233  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:00:36.137777  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:00:36.637390  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:00:37.137929  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:00:37.637188  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:00:38.138032  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:00:38.637277  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:00:39.138039  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:00:39.637254  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:00:40.137234  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:00:40.637250  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:00:41.138105  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:00:41.637860  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:00:42.137681  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:00:42.638034  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:00:43.137536  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:00:43.637222  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:00:44.137611  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:00:44.637912  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:00:45.137237  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:00:45.638106  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:00:46.137460  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:00:46.637152  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:00:47.137257  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:00:47.638178  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:00:48.138196  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:00:48.638213  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:00:49.137263  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:00:49.637789  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:00:50.137261  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:00:50.637271  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:00:51.137231  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:00:51.637615  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:00:52.137227  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:00:52.638097  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:00:53.137209  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:00:53.637791  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:00:54.137612  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:00:54.637221  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:00:55.137902  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:00:55.637570  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:00:56.137511  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:00:56.637266  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:00:57.137623  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:00:57.638100  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:00:58.137262  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:00:58.637214  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:00:59.137869  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:00:59.637821  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:01:00.137239  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:01:00.637160  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:01:01.138046  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:01:01.637620  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:01:02.137235  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:01:02.637296  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:01:03.137249  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:01:03.638189  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:01:04.138154  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:01:04.638115  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:01:05.137894  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:01:05.637850  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:01:06.137517  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:01:06.637792  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 20:01:06.637888  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 20:01:06.689623  202120 cri.go:89] found id: "18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92"
	I1202 20:01:06.689645  202120 cri.go:89] found id: ""
	I1202 20:01:06.689653  202120 logs.go:282] 1 containers: [18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92]
	I1202 20:01:06.689712  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:01:06.700600  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 20:01:06.700680  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 20:01:06.781859  202120 cri.go:89] found id: "f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53"
	I1202 20:01:06.781945  202120 cri.go:89] found id: ""
	I1202 20:01:06.781972  202120 logs.go:282] 1 containers: [f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53]
	I1202 20:01:06.782069  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:01:06.787431  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 20:01:06.787517  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 20:01:06.864964  202120 cri.go:89] found id: ""
	I1202 20:01:06.864990  202120 logs.go:282] 0 containers: []
	W1202 20:01:06.864999  202120 logs.go:284] No container was found matching "coredns"
	I1202 20:01:06.865006  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 20:01:06.865086  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 20:01:06.923295  202120 cri.go:89] found id: "80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3"
	I1202 20:01:06.923332  202120 cri.go:89] found id: ""
	I1202 20:01:06.923340  202120 logs.go:282] 1 containers: [80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3]
	I1202 20:01:06.923400  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:01:06.933219  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 20:01:06.933322  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 20:01:06.987425  202120 cri.go:89] found id: ""
	I1202 20:01:06.987446  202120 logs.go:282] 0 containers: []
	W1202 20:01:06.987454  202120 logs.go:284] No container was found matching "kube-proxy"
	I1202 20:01:06.987461  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 20:01:06.987522  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 20:01:07.034934  202120 cri.go:89] found id: "86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992"
	I1202 20:01:07.035016  202120 cri.go:89] found id: ""
	I1202 20:01:07.035038  202120 logs.go:282] 1 containers: [86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992]
	I1202 20:01:07.035146  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:01:07.041026  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 20:01:07.041153  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 20:01:07.089691  202120 cri.go:89] found id: ""
	I1202 20:01:07.089713  202120 logs.go:282] 0 containers: []
	W1202 20:01:07.089722  202120 logs.go:284] No container was found matching "kindnet"
	I1202 20:01:07.089728  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1202 20:01:07.089787  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1202 20:01:07.132498  202120 cri.go:89] found id: ""
	I1202 20:01:07.132575  202120 logs.go:282] 0 containers: []
	W1202 20:01:07.132599  202120 logs.go:284] No container was found matching "storage-provisioner"
	I1202 20:01:07.132630  202120 logs.go:123] Gathering logs for dmesg ...
	I1202 20:01:07.132684  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 20:01:07.154406  202120 logs.go:123] Gathering logs for describe nodes ...
	I1202 20:01:07.154483  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 20:01:07.250485  202120 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 20:01:07.250556  202120 logs.go:123] Gathering logs for kube-apiserver [18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92] ...
	I1202 20:01:07.250591  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92"
	I1202 20:01:07.296755  202120 logs.go:123] Gathering logs for etcd [f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53] ...
	I1202 20:01:07.296840  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53"
	I1202 20:01:07.355377  202120 logs.go:123] Gathering logs for kube-controller-manager [86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992] ...
	I1202 20:01:07.355456  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992"
	I1202 20:01:07.405466  202120 logs.go:123] Gathering logs for containerd ...
	I1202 20:01:07.405541  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 20:01:07.459381  202120 logs.go:123] Gathering logs for kubelet ...
	I1202 20:01:07.459466  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 20:01:07.525098  202120 logs.go:123] Gathering logs for kube-scheduler [80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3] ...
	I1202 20:01:07.525179  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3"
	I1202 20:01:07.603369  202120 logs.go:123] Gathering logs for container status ...
	I1202 20:01:07.603449  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 20:01:10.179175  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:01:10.197795  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 20:01:10.197911  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 20:01:10.247666  202120 cri.go:89] found id: "18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92"
	I1202 20:01:10.247691  202120 cri.go:89] found id: ""
	I1202 20:01:10.247700  202120 logs.go:282] 1 containers: [18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92]
	I1202 20:01:10.247756  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:01:10.253716  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 20:01:10.253795  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 20:01:10.301629  202120 cri.go:89] found id: "f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53"
	I1202 20:01:10.301651  202120 cri.go:89] found id: ""
	I1202 20:01:10.301659  202120 logs.go:282] 1 containers: [f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53]
	I1202 20:01:10.301748  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:01:10.305778  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 20:01:10.305883  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 20:01:10.337722  202120 cri.go:89] found id: ""
	I1202 20:01:10.337755  202120 logs.go:282] 0 containers: []
	W1202 20:01:10.337765  202120 logs.go:284] No container was found matching "coredns"
	I1202 20:01:10.337772  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 20:01:10.337875  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 20:01:10.374239  202120 cri.go:89] found id: "80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3"
	I1202 20:01:10.374313  202120 cri.go:89] found id: ""
	I1202 20:01:10.374336  202120 logs.go:282] 1 containers: [80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3]
	I1202 20:01:10.374423  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:01:10.378900  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 20:01:10.379016  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 20:01:10.422702  202120 cri.go:89] found id: ""
	I1202 20:01:10.422769  202120 logs.go:282] 0 containers: []
	W1202 20:01:10.422793  202120 logs.go:284] No container was found matching "kube-proxy"
	I1202 20:01:10.422813  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 20:01:10.422889  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 20:01:10.469998  202120 cri.go:89] found id: "86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992"
	I1202 20:01:10.470062  202120 cri.go:89] found id: ""
	I1202 20:01:10.470085  202120 logs.go:282] 1 containers: [86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992]
	I1202 20:01:10.470166  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:01:10.480128  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 20:01:10.480269  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 20:01:10.542657  202120 cri.go:89] found id: ""
	I1202 20:01:10.542739  202120 logs.go:282] 0 containers: []
	W1202 20:01:10.542764  202120 logs.go:284] No container was found matching "kindnet"
	I1202 20:01:10.542786  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1202 20:01:10.542876  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1202 20:01:10.585012  202120 cri.go:89] found id: ""
	I1202 20:01:10.585093  202120 logs.go:282] 0 containers: []
	W1202 20:01:10.585118  202120 logs.go:284] No container was found matching "storage-provisioner"
	I1202 20:01:10.585147  202120 logs.go:123] Gathering logs for dmesg ...
	I1202 20:01:10.585176  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 20:01:10.615246  202120 logs.go:123] Gathering logs for describe nodes ...
	I1202 20:01:10.615275  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 20:01:10.808541  202120 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 20:01:10.808566  202120 logs.go:123] Gathering logs for etcd [f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53] ...
	I1202 20:01:10.808582  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53"
	I1202 20:01:10.887001  202120 logs.go:123] Gathering logs for kube-scheduler [80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3] ...
	I1202 20:01:10.887033  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3"
	I1202 20:01:10.938949  202120 logs.go:123] Gathering logs for containerd ...
	I1202 20:01:10.938984  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 20:01:11.004441  202120 logs.go:123] Gathering logs for kubelet ...
	I1202 20:01:11.004531  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 20:01:11.081724  202120 logs.go:123] Gathering logs for kube-apiserver [18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92] ...
	I1202 20:01:11.081808  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92"
	I1202 20:01:11.139351  202120 logs.go:123] Gathering logs for kube-controller-manager [86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992] ...
	I1202 20:01:11.139385  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992"
	I1202 20:01:11.189963  202120 logs.go:123] Gathering logs for container status ...
	I1202 20:01:11.190000  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 20:01:13.741488  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:01:13.751812  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 20:01:13.751882  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 20:01:13.779583  202120 cri.go:89] found id: "18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92"
	I1202 20:01:13.779605  202120 cri.go:89] found id: ""
	I1202 20:01:13.779614  202120 logs.go:282] 1 containers: [18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92]
	I1202 20:01:13.779673  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:01:13.783394  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 20:01:13.783480  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 20:01:13.810708  202120 cri.go:89] found id: "f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53"
	I1202 20:01:13.810728  202120 cri.go:89] found id: ""
	I1202 20:01:13.810736  202120 logs.go:282] 1 containers: [f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53]
	I1202 20:01:13.810794  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:01:13.815587  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 20:01:13.815660  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 20:01:13.842020  202120 cri.go:89] found id: ""
	I1202 20:01:13.842049  202120 logs.go:282] 0 containers: []
	W1202 20:01:13.842060  202120 logs.go:284] No container was found matching "coredns"
	I1202 20:01:13.842066  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 20:01:13.842124  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 20:01:13.870472  202120 cri.go:89] found id: "80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3"
	I1202 20:01:13.870493  202120 cri.go:89] found id: ""
	I1202 20:01:13.870502  202120 logs.go:282] 1 containers: [80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3]
	I1202 20:01:13.870561  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:01:13.874389  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 20:01:13.874465  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 20:01:13.903079  202120 cri.go:89] found id: ""
	I1202 20:01:13.903101  202120 logs.go:282] 0 containers: []
	W1202 20:01:13.903110  202120 logs.go:284] No container was found matching "kube-proxy"
	I1202 20:01:13.903116  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 20:01:13.903179  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 20:01:13.930860  202120 cri.go:89] found id: "86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992"
	I1202 20:01:13.930880  202120 cri.go:89] found id: ""
	I1202 20:01:13.930888  202120 logs.go:282] 1 containers: [86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992]
	I1202 20:01:13.930949  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:01:13.934787  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 20:01:13.934863  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 20:01:13.970760  202120 cri.go:89] found id: ""
	I1202 20:01:13.970783  202120 logs.go:282] 0 containers: []
	W1202 20:01:13.970791  202120 logs.go:284] No container was found matching "kindnet"
	I1202 20:01:13.970799  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1202 20:01:13.970858  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1202 20:01:14.006120  202120 cri.go:89] found id: ""
	I1202 20:01:14.006143  202120 logs.go:282] 0 containers: []
	W1202 20:01:14.006152  202120 logs.go:284] No container was found matching "storage-provisioner"
	I1202 20:01:14.006172  202120 logs.go:123] Gathering logs for etcd [f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53] ...
	I1202 20:01:14.006186  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53"
	I1202 20:01:14.057476  202120 logs.go:123] Gathering logs for containerd ...
	I1202 20:01:14.057511  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 20:01:14.095181  202120 logs.go:123] Gathering logs for container status ...
	I1202 20:01:14.095217  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 20:01:14.135086  202120 logs.go:123] Gathering logs for kubelet ...
	I1202 20:01:14.135115  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 20:01:14.197862  202120 logs.go:123] Gathering logs for dmesg ...
	I1202 20:01:14.197896  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 20:01:14.213377  202120 logs.go:123] Gathering logs for describe nodes ...
	I1202 20:01:14.213406  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 20:01:14.289270  202120 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 20:01:14.289298  202120 logs.go:123] Gathering logs for kube-apiserver [18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92] ...
	I1202 20:01:14.289311  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92"
	I1202 20:01:14.341292  202120 logs.go:123] Gathering logs for kube-scheduler [80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3] ...
	I1202 20:01:14.341327  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3"
	I1202 20:01:14.394384  202120 logs.go:123] Gathering logs for kube-controller-manager [86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992] ...
	I1202 20:01:14.394419  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992"
	I1202 20:01:16.948445  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:01:16.959096  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 20:01:16.959169  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 20:01:16.994026  202120 cri.go:89] found id: "18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92"
	I1202 20:01:16.994055  202120 cri.go:89] found id: ""
	I1202 20:01:16.994065  202120 logs.go:282] 1 containers: [18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92]
	I1202 20:01:16.994128  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:01:17.000266  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 20:01:17.000376  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 20:01:17.053985  202120 cri.go:89] found id: "f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53"
	I1202 20:01:17.054007  202120 cri.go:89] found id: ""
	I1202 20:01:17.054015  202120 logs.go:282] 1 containers: [f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53]
	I1202 20:01:17.054072  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:01:17.059729  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 20:01:17.059805  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 20:01:17.092431  202120 cri.go:89] found id: ""
	I1202 20:01:17.092453  202120 logs.go:282] 0 containers: []
	W1202 20:01:17.092462  202120 logs.go:284] No container was found matching "coredns"
	I1202 20:01:17.092468  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 20:01:17.092549  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 20:01:17.151053  202120 cri.go:89] found id: "80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3"
	I1202 20:01:17.151078  202120 cri.go:89] found id: ""
	I1202 20:01:17.151086  202120 logs.go:282] 1 containers: [80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3]
	I1202 20:01:17.151142  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:01:17.155071  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 20:01:17.155145  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 20:01:17.182766  202120 cri.go:89] found id: ""
	I1202 20:01:17.182839  202120 logs.go:282] 0 containers: []
	W1202 20:01:17.182862  202120 logs.go:284] No container was found matching "kube-proxy"
	I1202 20:01:17.182883  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 20:01:17.182975  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 20:01:17.210779  202120 cri.go:89] found id: "86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992"
	I1202 20:01:17.210799  202120 cri.go:89] found id: ""
	I1202 20:01:17.210807  202120 logs.go:282] 1 containers: [86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992]
	I1202 20:01:17.210863  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:01:17.214631  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 20:01:17.214709  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 20:01:17.239924  202120 cri.go:89] found id: ""
	I1202 20:01:17.239951  202120 logs.go:282] 0 containers: []
	W1202 20:01:17.239960  202120 logs.go:284] No container was found matching "kindnet"
	I1202 20:01:17.239967  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1202 20:01:17.240035  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1202 20:01:17.264518  202120 cri.go:89] found id: ""
	I1202 20:01:17.264545  202120 logs.go:282] 0 containers: []
	W1202 20:01:17.264553  202120 logs.go:284] No container was found matching "storage-provisioner"
	I1202 20:01:17.264567  202120 logs.go:123] Gathering logs for containerd ...
	I1202 20:01:17.264579  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 20:01:17.297867  202120 logs.go:123] Gathering logs for container status ...
	I1202 20:01:17.297902  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 20:01:17.331524  202120 logs.go:123] Gathering logs for kube-apiserver [18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92] ...
	I1202 20:01:17.331554  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92"
	I1202 20:01:17.366386  202120 logs.go:123] Gathering logs for kube-controller-manager [86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992] ...
	I1202 20:01:17.366420  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992"
	I1202 20:01:17.400270  202120 logs.go:123] Gathering logs for kubelet ...
	I1202 20:01:17.400302  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 20:01:17.468487  202120 logs.go:123] Gathering logs for dmesg ...
	I1202 20:01:17.468526  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 20:01:17.497194  202120 logs.go:123] Gathering logs for describe nodes ...
	I1202 20:01:17.497225  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 20:01:17.588427  202120 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 20:01:17.588463  202120 logs.go:123] Gathering logs for etcd [f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53] ...
	I1202 20:01:17.588477  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53"
	I1202 20:01:17.626437  202120 logs.go:123] Gathering logs for kube-scheduler [80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3] ...
	I1202 20:01:17.626506  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3"
	I1202 20:01:20.167997  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:01:20.178517  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 20:01:20.178590  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 20:01:20.203974  202120 cri.go:89] found id: "18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92"
	I1202 20:01:20.203998  202120 cri.go:89] found id: ""
	I1202 20:01:20.204007  202120 logs.go:282] 1 containers: [18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92]
	I1202 20:01:20.204066  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:01:20.207797  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 20:01:20.207868  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 20:01:20.234880  202120 cri.go:89] found id: "f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53"
	I1202 20:01:20.234901  202120 cri.go:89] found id: ""
	I1202 20:01:20.234909  202120 logs.go:282] 1 containers: [f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53]
	I1202 20:01:20.234970  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:01:20.238856  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 20:01:20.238936  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 20:01:20.263865  202120 cri.go:89] found id: ""
	I1202 20:01:20.263891  202120 logs.go:282] 0 containers: []
	W1202 20:01:20.263901  202120 logs.go:284] No container was found matching "coredns"
	I1202 20:01:20.263907  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 20:01:20.263965  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 20:01:20.290872  202120 cri.go:89] found id: "80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3"
	I1202 20:01:20.290898  202120 cri.go:89] found id: ""
	I1202 20:01:20.290906  202120 logs.go:282] 1 containers: [80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3]
	I1202 20:01:20.290971  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:01:20.294722  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 20:01:20.294798  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 20:01:20.323132  202120 cri.go:89] found id: ""
	I1202 20:01:20.323157  202120 logs.go:282] 0 containers: []
	W1202 20:01:20.323166  202120 logs.go:284] No container was found matching "kube-proxy"
	I1202 20:01:20.323173  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 20:01:20.323233  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 20:01:20.351337  202120 cri.go:89] found id: "86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992"
	I1202 20:01:20.351359  202120 cri.go:89] found id: ""
	I1202 20:01:20.351367  202120 logs.go:282] 1 containers: [86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992]
	I1202 20:01:20.351424  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:01:20.355217  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 20:01:20.355288  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 20:01:20.380654  202120 cri.go:89] found id: ""
	I1202 20:01:20.380676  202120 logs.go:282] 0 containers: []
	W1202 20:01:20.380685  202120 logs.go:284] No container was found matching "kindnet"
	I1202 20:01:20.380716  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1202 20:01:20.380775  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1202 20:01:20.410348  202120 cri.go:89] found id: ""
	I1202 20:01:20.410371  202120 logs.go:282] 0 containers: []
	W1202 20:01:20.410379  202120 logs.go:284] No container was found matching "storage-provisioner"
	I1202 20:01:20.410395  202120 logs.go:123] Gathering logs for dmesg ...
	I1202 20:01:20.410407  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 20:01:20.424652  202120 logs.go:123] Gathering logs for describe nodes ...
	I1202 20:01:20.424677  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 20:01:20.506804  202120 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 20:01:20.506829  202120 logs.go:123] Gathering logs for kube-apiserver [18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92] ...
	I1202 20:01:20.506843  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92"
	I1202 20:01:20.541011  202120 logs.go:123] Gathering logs for kube-scheduler [80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3] ...
	I1202 20:01:20.541047  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3"
	I1202 20:01:20.576574  202120 logs.go:123] Gathering logs for containerd ...
	I1202 20:01:20.576604  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 20:01:20.610079  202120 logs.go:123] Gathering logs for kubelet ...
	I1202 20:01:20.610110  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 20:01:20.671051  202120 logs.go:123] Gathering logs for etcd [f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53] ...
	I1202 20:01:20.671084  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53"
	I1202 20:01:20.707091  202120 logs.go:123] Gathering logs for kube-controller-manager [86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992] ...
	I1202 20:01:20.707122  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992"
	I1202 20:01:20.741853  202120 logs.go:123] Gathering logs for container status ...
	I1202 20:01:20.741887  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 20:01:23.271434  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:01:23.281889  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 20:01:23.281957  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 20:01:23.317282  202120 cri.go:89] found id: "18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92"
	I1202 20:01:23.317300  202120 cri.go:89] found id: ""
	I1202 20:01:23.317307  202120 logs.go:282] 1 containers: [18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92]
	I1202 20:01:23.317362  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:01:23.322432  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 20:01:23.322502  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 20:01:23.355001  202120 cri.go:89] found id: "f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53"
	I1202 20:01:23.355017  202120 cri.go:89] found id: ""
	I1202 20:01:23.355026  202120 logs.go:282] 1 containers: [f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53]
	I1202 20:01:23.355074  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:01:23.359309  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 20:01:23.359390  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 20:01:23.402674  202120 cri.go:89] found id: ""
	I1202 20:01:23.402695  202120 logs.go:282] 0 containers: []
	W1202 20:01:23.402704  202120 logs.go:284] No container was found matching "coredns"
	I1202 20:01:23.402711  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 20:01:23.402770  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 20:01:23.466443  202120 cri.go:89] found id: "80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3"
	I1202 20:01:23.466462  202120 cri.go:89] found id: ""
	I1202 20:01:23.466470  202120 logs.go:282] 1 containers: [80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3]
	I1202 20:01:23.466527  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:01:23.471812  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 20:01:23.471884  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 20:01:23.513431  202120 cri.go:89] found id: ""
	I1202 20:01:23.513453  202120 logs.go:282] 0 containers: []
	W1202 20:01:23.513462  202120 logs.go:284] No container was found matching "kube-proxy"
	I1202 20:01:23.513469  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 20:01:23.513534  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 20:01:23.556395  202120 cri.go:89] found id: "86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992"
	I1202 20:01:23.556420  202120 cri.go:89] found id: ""
	I1202 20:01:23.556430  202120 logs.go:282] 1 containers: [86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992]
	I1202 20:01:23.556487  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:01:23.561055  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 20:01:23.561125  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 20:01:23.590714  202120 cri.go:89] found id: ""
	I1202 20:01:23.590736  202120 logs.go:282] 0 containers: []
	W1202 20:01:23.590746  202120 logs.go:284] No container was found matching "kindnet"
	I1202 20:01:23.590753  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1202 20:01:23.590811  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1202 20:01:23.624313  202120 cri.go:89] found id: ""
	I1202 20:01:23.624357  202120 logs.go:282] 0 containers: []
	W1202 20:01:23.624366  202120 logs.go:284] No container was found matching "storage-provisioner"
	I1202 20:01:23.624380  202120 logs.go:123] Gathering logs for etcd [f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53] ...
	I1202 20:01:23.624392  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53"
	I1202 20:01:23.670834  202120 logs.go:123] Gathering logs for kube-scheduler [80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3] ...
	I1202 20:01:23.670908  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3"
	I1202 20:01:23.717999  202120 logs.go:123] Gathering logs for kubelet ...
	I1202 20:01:23.718070  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 20:01:23.788625  202120 logs.go:123] Gathering logs for dmesg ...
	I1202 20:01:23.788701  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 20:01:23.805812  202120 logs.go:123] Gathering logs for describe nodes ...
	I1202 20:01:23.805838  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 20:01:23.892251  202120 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 20:01:23.892269  202120 logs.go:123] Gathering logs for kube-controller-manager [86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992] ...
	I1202 20:01:23.892281  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992"
	I1202 20:01:23.933884  202120 logs.go:123] Gathering logs for containerd ...
	I1202 20:01:23.933949  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 20:01:23.970669  202120 logs.go:123] Gathering logs for container status ...
	I1202 20:01:23.970697  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 20:01:24.030765  202120 logs.go:123] Gathering logs for kube-apiserver [18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92] ...
	I1202 20:01:24.030792  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92"
	I1202 20:01:26.574306  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:01:26.587550  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 20:01:26.587660  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 20:01:26.648870  202120 cri.go:89] found id: "18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92"
	I1202 20:01:26.648902  202120 cri.go:89] found id: ""
	I1202 20:01:26.648916  202120 logs.go:282] 1 containers: [18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92]
	I1202 20:01:26.648982  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:01:26.656851  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 20:01:26.656930  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 20:01:26.711968  202120 cri.go:89] found id: "f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53"
	I1202 20:01:26.711989  202120 cri.go:89] found id: ""
	I1202 20:01:26.711997  202120 logs.go:282] 1 containers: [f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53]
	I1202 20:01:26.712074  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:01:26.721683  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 20:01:26.721763  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 20:01:26.756261  202120 cri.go:89] found id: ""
	I1202 20:01:26.756298  202120 logs.go:282] 0 containers: []
	W1202 20:01:26.756308  202120 logs.go:284] No container was found matching "coredns"
	I1202 20:01:26.756315  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 20:01:26.756438  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 20:01:26.841120  202120 cri.go:89] found id: "80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3"
	I1202 20:01:26.841206  202120 cri.go:89] found id: ""
	I1202 20:01:26.841235  202120 logs.go:282] 1 containers: [80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3]
	I1202 20:01:26.841315  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:01:26.857890  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 20:01:26.858024  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 20:01:26.916690  202120 cri.go:89] found id: ""
	I1202 20:01:26.916772  202120 logs.go:282] 0 containers: []
	W1202 20:01:26.916802  202120 logs.go:284] No container was found matching "kube-proxy"
	I1202 20:01:26.916823  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 20:01:26.916938  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 20:01:26.974933  202120 cri.go:89] found id: "86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992"
	I1202 20:01:26.975036  202120 cri.go:89] found id: ""
	I1202 20:01:26.975062  202120 logs.go:282] 1 containers: [86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992]
	I1202 20:01:26.975156  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:01:26.989127  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 20:01:26.989297  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 20:01:27.054643  202120 cri.go:89] found id: ""
	I1202 20:01:27.054718  202120 logs.go:282] 0 containers: []
	W1202 20:01:27.054741  202120 logs.go:284] No container was found matching "kindnet"
	I1202 20:01:27.054759  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1202 20:01:27.054830  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1202 20:01:27.113902  202120 cri.go:89] found id: ""
	I1202 20:01:27.113966  202120 logs.go:282] 0 containers: []
	W1202 20:01:27.113989  202120 logs.go:284] No container was found matching "storage-provisioner"
	I1202 20:01:27.114017  202120 logs.go:123] Gathering logs for kubelet ...
	I1202 20:01:27.114051  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 20:01:27.202551  202120 logs.go:123] Gathering logs for describe nodes ...
	I1202 20:01:27.202630  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 20:01:27.343691  202120 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 20:01:27.343709  202120 logs.go:123] Gathering logs for kube-apiserver [18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92] ...
	I1202 20:01:27.343721  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92"
	I1202 20:01:27.414307  202120 logs.go:123] Gathering logs for etcd [f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53] ...
	I1202 20:01:27.414382  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53"
	I1202 20:01:27.461457  202120 logs.go:123] Gathering logs for kube-scheduler [80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3] ...
	I1202 20:01:27.461527  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3"
	I1202 20:01:27.516423  202120 logs.go:123] Gathering logs for kube-controller-manager [86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992] ...
	I1202 20:01:27.516506  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992"
	I1202 20:01:27.565935  202120 logs.go:123] Gathering logs for containerd ...
	I1202 20:01:27.565970  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 20:01:27.617203  202120 logs.go:123] Gathering logs for dmesg ...
	I1202 20:01:27.617238  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 20:01:27.639271  202120 logs.go:123] Gathering logs for container status ...
	I1202 20:01:27.639299  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 20:01:30.210058  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:01:30.227298  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 20:01:30.227372  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 20:01:30.294516  202120 cri.go:89] found id: "18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92"
	I1202 20:01:30.294536  202120 cri.go:89] found id: ""
	I1202 20:01:30.294544  202120 logs.go:282] 1 containers: [18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92]
	I1202 20:01:30.294612  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:01:30.305264  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 20:01:30.305339  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 20:01:30.349189  202120 cri.go:89] found id: "f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53"
	I1202 20:01:30.349209  202120 cri.go:89] found id: ""
	I1202 20:01:30.349217  202120 logs.go:282] 1 containers: [f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53]
	I1202 20:01:30.349278  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:01:30.361344  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 20:01:30.361416  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 20:01:30.407330  202120 cri.go:89] found id: ""
	I1202 20:01:30.407351  202120 logs.go:282] 0 containers: []
	W1202 20:01:30.407359  202120 logs.go:284] No container was found matching "coredns"
	I1202 20:01:30.407365  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 20:01:30.407423  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 20:01:30.456936  202120 cri.go:89] found id: "80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3"
	I1202 20:01:30.456955  202120 cri.go:89] found id: ""
	I1202 20:01:30.456963  202120 logs.go:282] 1 containers: [80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3]
	I1202 20:01:30.457021  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:01:30.464991  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 20:01:30.465166  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 20:01:30.515196  202120 cri.go:89] found id: ""
	I1202 20:01:30.515218  202120 logs.go:282] 0 containers: []
	W1202 20:01:30.515228  202120 logs.go:284] No container was found matching "kube-proxy"
	I1202 20:01:30.515235  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 20:01:30.515293  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 20:01:30.569043  202120 cri.go:89] found id: "86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992"
	I1202 20:01:30.569063  202120 cri.go:89] found id: ""
	I1202 20:01:30.569071  202120 logs.go:282] 1 containers: [86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992]
	I1202 20:01:30.569128  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:01:30.572995  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 20:01:30.573065  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 20:01:30.600277  202120 cri.go:89] found id: ""
	I1202 20:01:30.600388  202120 logs.go:282] 0 containers: []
	W1202 20:01:30.600412  202120 logs.go:284] No container was found matching "kindnet"
	I1202 20:01:30.600432  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1202 20:01:30.600537  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1202 20:01:30.642865  202120 cri.go:89] found id: ""
	I1202 20:01:30.642886  202120 logs.go:282] 0 containers: []
	W1202 20:01:30.642895  202120 logs.go:284] No container was found matching "storage-provisioner"
	I1202 20:01:30.642915  202120 logs.go:123] Gathering logs for kubelet ...
	I1202 20:01:30.642926  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 20:01:30.713641  202120 logs.go:123] Gathering logs for kube-apiserver [18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92] ...
	I1202 20:01:30.713675  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92"
	I1202 20:01:30.779797  202120 logs.go:123] Gathering logs for etcd [f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53] ...
	I1202 20:01:30.779837  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53"
	I1202 20:01:30.822444  202120 logs.go:123] Gathering logs for kube-scheduler [80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3] ...
	I1202 20:01:30.822477  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3"
	I1202 20:01:30.854993  202120 logs.go:123] Gathering logs for kube-controller-manager [86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992] ...
	I1202 20:01:30.855029  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992"
	I1202 20:01:30.943876  202120 logs.go:123] Gathering logs for container status ...
	I1202 20:01:30.943915  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 20:01:31.045172  202120 logs.go:123] Gathering logs for dmesg ...
	I1202 20:01:31.045202  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 20:01:31.059183  202120 logs.go:123] Gathering logs for describe nodes ...
	I1202 20:01:31.059214  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 20:01:31.156090  202120 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 20:01:31.156114  202120 logs.go:123] Gathering logs for containerd ...
	I1202 20:01:31.156128  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 20:01:33.706776  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:01:33.717055  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 20:01:33.717127  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 20:01:33.742078  202120 cri.go:89] found id: "18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92"
	I1202 20:01:33.742150  202120 cri.go:89] found id: ""
	I1202 20:01:33.742172  202120 logs.go:282] 1 containers: [18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92]
	I1202 20:01:33.742274  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:01:33.746137  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 20:01:33.746208  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 20:01:33.771554  202120 cri.go:89] found id: "f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53"
	I1202 20:01:33.771576  202120 cri.go:89] found id: ""
	I1202 20:01:33.771586  202120 logs.go:282] 1 containers: [f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53]
	I1202 20:01:33.771644  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:01:33.775604  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 20:01:33.775685  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 20:01:33.802985  202120 cri.go:89] found id: ""
	I1202 20:01:33.803010  202120 logs.go:282] 0 containers: []
	W1202 20:01:33.803018  202120 logs.go:284] No container was found matching "coredns"
	I1202 20:01:33.803026  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 20:01:33.803085  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 20:01:33.831102  202120 cri.go:89] found id: "80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3"
	I1202 20:01:33.831123  202120 cri.go:89] found id: ""
	I1202 20:01:33.831134  202120 logs.go:282] 1 containers: [80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3]
	I1202 20:01:33.831192  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:01:33.834979  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 20:01:33.835055  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 20:01:33.859772  202120 cri.go:89] found id: ""
	I1202 20:01:33.859797  202120 logs.go:282] 0 containers: []
	W1202 20:01:33.859806  202120 logs.go:284] No container was found matching "kube-proxy"
	I1202 20:01:33.859812  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 20:01:33.859876  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 20:01:33.885836  202120 cri.go:89] found id: "86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992"
	I1202 20:01:33.885860  202120 cri.go:89] found id: ""
	I1202 20:01:33.885868  202120 logs.go:282] 1 containers: [86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992]
	I1202 20:01:33.885950  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:01:33.889648  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 20:01:33.889747  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 20:01:33.914594  202120 cri.go:89] found id: ""
	I1202 20:01:33.914668  202120 logs.go:282] 0 containers: []
	W1202 20:01:33.914691  202120 logs.go:284] No container was found matching "kindnet"
	I1202 20:01:33.914711  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1202 20:01:33.914794  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1202 20:01:33.944294  202120 cri.go:89] found id: ""
	I1202 20:01:33.944347  202120 logs.go:282] 0 containers: []
	W1202 20:01:33.944357  202120 logs.go:284] No container was found matching "storage-provisioner"
	I1202 20:01:33.944391  202120 logs.go:123] Gathering logs for kubelet ...
	I1202 20:01:33.944410  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 20:01:34.008610  202120 logs.go:123] Gathering logs for describe nodes ...
	I1202 20:01:34.008654  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 20:01:34.077842  202120 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 20:01:34.077868  202120 logs.go:123] Gathering logs for kube-apiserver [18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92] ...
	I1202 20:01:34.077884  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92"
	I1202 20:01:34.114420  202120 logs.go:123] Gathering logs for etcd [f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53] ...
	I1202 20:01:34.114454  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53"
	I1202 20:01:34.146990  202120 logs.go:123] Gathering logs for containerd ...
	I1202 20:01:34.147019  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 20:01:34.180706  202120 logs.go:123] Gathering logs for dmesg ...
	I1202 20:01:34.180738  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 20:01:34.193517  202120 logs.go:123] Gathering logs for kube-scheduler [80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3] ...
	I1202 20:01:34.193548  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3"
	I1202 20:01:34.225971  202120 logs.go:123] Gathering logs for kube-controller-manager [86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992] ...
	I1202 20:01:34.226003  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992"
	I1202 20:01:34.262946  202120 logs.go:123] Gathering logs for container status ...
	I1202 20:01:34.262978  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 20:01:36.793069  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:01:36.803604  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 20:01:36.803684  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 20:01:36.830226  202120 cri.go:89] found id: "18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92"
	I1202 20:01:36.830250  202120 cri.go:89] found id: ""
	I1202 20:01:36.830259  202120 logs.go:282] 1 containers: [18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92]
	I1202 20:01:36.830320  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:01:36.834138  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 20:01:36.834210  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 20:01:36.863453  202120 cri.go:89] found id: "f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53"
	I1202 20:01:36.863478  202120 cri.go:89] found id: ""
	I1202 20:01:36.863486  202120 logs.go:282] 1 containers: [f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53]
	I1202 20:01:36.863543  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:01:36.867520  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 20:01:36.867596  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 20:01:36.893433  202120 cri.go:89] found id: ""
	I1202 20:01:36.893462  202120 logs.go:282] 0 containers: []
	W1202 20:01:36.893471  202120 logs.go:284] No container was found matching "coredns"
	I1202 20:01:36.893478  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 20:01:36.893544  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 20:01:36.921691  202120 cri.go:89] found id: "80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3"
	I1202 20:01:36.921715  202120 cri.go:89] found id: ""
	I1202 20:01:36.921724  202120 logs.go:282] 1 containers: [80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3]
	I1202 20:01:36.921805  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:01:36.926502  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 20:01:36.926661  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 20:01:36.962611  202120 cri.go:89] found id: ""
	I1202 20:01:36.962635  202120 logs.go:282] 0 containers: []
	W1202 20:01:36.962645  202120 logs.go:284] No container was found matching "kube-proxy"
	I1202 20:01:36.962671  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 20:01:36.962750  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 20:01:36.990374  202120 cri.go:89] found id: "86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992"
	I1202 20:01:36.990397  202120 cri.go:89] found id: ""
	I1202 20:01:36.990405  202120 logs.go:282] 1 containers: [86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992]
	I1202 20:01:36.990483  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:01:36.996144  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 20:01:36.996265  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 20:01:37.027622  202120 cri.go:89] found id: ""
	I1202 20:01:37.027648  202120 logs.go:282] 0 containers: []
	W1202 20:01:37.027656  202120 logs.go:284] No container was found matching "kindnet"
	I1202 20:01:37.027664  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1202 20:01:37.027767  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1202 20:01:37.055164  202120 cri.go:89] found id: ""
	I1202 20:01:37.055189  202120 logs.go:282] 0 containers: []
	W1202 20:01:37.055198  202120 logs.go:284] No container was found matching "storage-provisioner"
	I1202 20:01:37.055242  202120 logs.go:123] Gathering logs for container status ...
	I1202 20:01:37.055262  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 20:01:37.083765  202120 logs.go:123] Gathering logs for kubelet ...
	I1202 20:01:37.083794  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 20:01:37.141886  202120 logs.go:123] Gathering logs for dmesg ...
	I1202 20:01:37.141920  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 20:01:37.155080  202120 logs.go:123] Gathering logs for describe nodes ...
	I1202 20:01:37.155108  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 20:01:37.220814  202120 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 20:01:37.220837  202120 logs.go:123] Gathering logs for kube-scheduler [80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3] ...
	I1202 20:01:37.220851  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3"
	I1202 20:01:37.263282  202120 logs.go:123] Gathering logs for kube-controller-manager [86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992] ...
	I1202 20:01:37.263312  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992"
	I1202 20:01:37.302328  202120 logs.go:123] Gathering logs for containerd ...
	I1202 20:01:37.302357  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 20:01:37.335051  202120 logs.go:123] Gathering logs for kube-apiserver [18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92] ...
	I1202 20:01:37.335082  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92"
	I1202 20:01:37.369609  202120 logs.go:123] Gathering logs for etcd [f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53] ...
	I1202 20:01:37.369643  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53"
	I1202 20:01:39.904690  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:01:39.915279  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 20:01:39.915352  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 20:01:39.948841  202120 cri.go:89] found id: "18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92"
	I1202 20:01:39.948865  202120 cri.go:89] found id: ""
	I1202 20:01:39.948873  202120 logs.go:282] 1 containers: [18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92]
	I1202 20:01:39.948937  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:01:39.953396  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 20:01:39.953476  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 20:01:39.987640  202120 cri.go:89] found id: "f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53"
	I1202 20:01:39.987663  202120 cri.go:89] found id: ""
	I1202 20:01:39.987672  202120 logs.go:282] 1 containers: [f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53]
	I1202 20:01:39.987727  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:01:39.991774  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 20:01:39.991842  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 20:01:40.038635  202120 cri.go:89] found id: ""
	I1202 20:01:40.038659  202120 logs.go:282] 0 containers: []
	W1202 20:01:40.038668  202120 logs.go:284] No container was found matching "coredns"
	I1202 20:01:40.038676  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 20:01:40.038742  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 20:01:40.068169  202120 cri.go:89] found id: "80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3"
	I1202 20:01:40.068245  202120 cri.go:89] found id: ""
	I1202 20:01:40.068271  202120 logs.go:282] 1 containers: [80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3]
	I1202 20:01:40.068391  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:01:40.072692  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 20:01:40.072818  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 20:01:40.100498  202120 cri.go:89] found id: ""
	I1202 20:01:40.100523  202120 logs.go:282] 0 containers: []
	W1202 20:01:40.100533  202120 logs.go:284] No container was found matching "kube-proxy"
	I1202 20:01:40.100541  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 20:01:40.100636  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 20:01:40.128815  202120 cri.go:89] found id: "86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992"
	I1202 20:01:40.128851  202120 cri.go:89] found id: ""
	I1202 20:01:40.128862  202120 logs.go:282] 1 containers: [86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992]
	I1202 20:01:40.128941  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:01:40.133248  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 20:01:40.133326  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 20:01:40.160713  202120 cri.go:89] found id: ""
	I1202 20:01:40.160742  202120 logs.go:282] 0 containers: []
	W1202 20:01:40.160751  202120 logs.go:284] No container was found matching "kindnet"
	I1202 20:01:40.160758  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1202 20:01:40.160820  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1202 20:01:40.187671  202120 cri.go:89] found id: ""
	I1202 20:01:40.187698  202120 logs.go:282] 0 containers: []
	W1202 20:01:40.187707  202120 logs.go:284] No container was found matching "storage-provisioner"
	I1202 20:01:40.187722  202120 logs.go:123] Gathering logs for kube-apiserver [18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92] ...
	I1202 20:01:40.187734  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92"
	I1202 20:01:40.223872  202120 logs.go:123] Gathering logs for etcd [f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53] ...
	I1202 20:01:40.223909  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53"
	I1202 20:01:40.257054  202120 logs.go:123] Gathering logs for kube-controller-manager [86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992] ...
	I1202 20:01:40.257087  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992"
	I1202 20:01:40.295537  202120 logs.go:123] Gathering logs for containerd ...
	I1202 20:01:40.295570  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 20:01:40.330531  202120 logs.go:123] Gathering logs for container status ...
	I1202 20:01:40.330566  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 20:01:40.364185  202120 logs.go:123] Gathering logs for kubelet ...
	I1202 20:01:40.364215  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 20:01:40.424638  202120 logs.go:123] Gathering logs for describe nodes ...
	I1202 20:01:40.424669  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 20:01:40.496316  202120 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 20:01:40.496384  202120 logs.go:123] Gathering logs for kube-scheduler [80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3] ...
	I1202 20:01:40.496397  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3"
	I1202 20:01:40.528986  202120 logs.go:123] Gathering logs for dmesg ...
	I1202 20:01:40.529018  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 20:01:43.042284  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:01:43.052764  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 20:01:43.052845  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 20:01:43.077794  202120 cri.go:89] found id: "18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92"
	I1202 20:01:43.077816  202120 cri.go:89] found id: ""
	I1202 20:01:43.077824  202120 logs.go:282] 1 containers: [18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92]
	I1202 20:01:43.077911  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:01:43.081781  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 20:01:43.081857  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 20:01:43.107229  202120 cri.go:89] found id: "f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53"
	I1202 20:01:43.107252  202120 cri.go:89] found id: ""
	I1202 20:01:43.107260  202120 logs.go:282] 1 containers: [f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53]
	I1202 20:01:43.107318  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:01:43.111102  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 20:01:43.111181  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 20:01:43.136947  202120 cri.go:89] found id: ""
	I1202 20:01:43.136972  202120 logs.go:282] 0 containers: []
	W1202 20:01:43.136981  202120 logs.go:284] No container was found matching "coredns"
	I1202 20:01:43.136987  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 20:01:43.137049  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 20:01:43.163143  202120 cri.go:89] found id: "80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3"
	I1202 20:01:43.163165  202120 cri.go:89] found id: ""
	I1202 20:01:43.163174  202120 logs.go:282] 1 containers: [80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3]
	I1202 20:01:43.163232  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:01:43.166964  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 20:01:43.167035  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 20:01:43.191839  202120 cri.go:89] found id: ""
	I1202 20:01:43.191863  202120 logs.go:282] 0 containers: []
	W1202 20:01:43.191872  202120 logs.go:284] No container was found matching "kube-proxy"
	I1202 20:01:43.191878  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 20:01:43.191936  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 20:01:43.215779  202120 cri.go:89] found id: "86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992"
	I1202 20:01:43.215802  202120 cri.go:89] found id: ""
	I1202 20:01:43.215810  202120 logs.go:282] 1 containers: [86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992]
	I1202 20:01:43.215873  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:01:43.219532  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 20:01:43.219604  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 20:01:43.244889  202120 cri.go:89] found id: ""
	I1202 20:01:43.244913  202120 logs.go:282] 0 containers: []
	W1202 20:01:43.244922  202120 logs.go:284] No container was found matching "kindnet"
	I1202 20:01:43.244928  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1202 20:01:43.244985  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1202 20:01:43.279716  202120 cri.go:89] found id: ""
	I1202 20:01:43.279742  202120 logs.go:282] 0 containers: []
	W1202 20:01:43.279751  202120 logs.go:284] No container was found matching "storage-provisioner"
	I1202 20:01:43.279764  202120 logs.go:123] Gathering logs for kube-controller-manager [86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992] ...
	I1202 20:01:43.279775  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992"
	I1202 20:01:43.313813  202120 logs.go:123] Gathering logs for kubelet ...
	I1202 20:01:43.313846  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 20:01:43.373358  202120 logs.go:123] Gathering logs for describe nodes ...
	I1202 20:01:43.373390  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 20:01:43.438790  202120 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 20:01:43.438810  202120 logs.go:123] Gathering logs for kube-apiserver [18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92] ...
	I1202 20:01:43.438822  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92"
	I1202 20:01:43.472191  202120 logs.go:123] Gathering logs for etcd [f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53] ...
	I1202 20:01:43.472221  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53"
	I1202 20:01:43.504449  202120 logs.go:123] Gathering logs for kube-scheduler [80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3] ...
	I1202 20:01:43.504481  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3"
	I1202 20:01:43.536845  202120 logs.go:123] Gathering logs for containerd ...
	I1202 20:01:43.536875  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 20:01:43.568732  202120 logs.go:123] Gathering logs for container status ...
	I1202 20:01:43.568766  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 20:01:43.599494  202120 logs.go:123] Gathering logs for dmesg ...
	I1202 20:01:43.599533  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 20:01:46.113168  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:01:46.123746  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 20:01:46.123820  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 20:01:46.149523  202120 cri.go:89] found id: "18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92"
	I1202 20:01:46.149546  202120 cri.go:89] found id: ""
	I1202 20:01:46.149555  202120 logs.go:282] 1 containers: [18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92]
	I1202 20:01:46.149618  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:01:46.153401  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 20:01:46.153473  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 20:01:46.178840  202120 cri.go:89] found id: "f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53"
	I1202 20:01:46.178861  202120 cri.go:89] found id: ""
	I1202 20:01:46.178869  202120 logs.go:282] 1 containers: [f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53]
	I1202 20:01:46.178925  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:01:46.182689  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 20:01:46.182756  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 20:01:46.209747  202120 cri.go:89] found id: ""
	I1202 20:01:46.209774  202120 logs.go:282] 0 containers: []
	W1202 20:01:46.209782  202120 logs.go:284] No container was found matching "coredns"
	I1202 20:01:46.209789  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 20:01:46.209848  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 20:01:46.237006  202120 cri.go:89] found id: "80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3"
	I1202 20:01:46.237029  202120 cri.go:89] found id: ""
	I1202 20:01:46.237037  202120 logs.go:282] 1 containers: [80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3]
	I1202 20:01:46.237121  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:01:46.240849  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 20:01:46.240925  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 20:01:46.265349  202120 cri.go:89] found id: ""
	I1202 20:01:46.265371  202120 logs.go:282] 0 containers: []
	W1202 20:01:46.265379  202120 logs.go:284] No container was found matching "kube-proxy"
	I1202 20:01:46.265386  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 20:01:46.265443  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 20:01:46.290763  202120 cri.go:89] found id: "86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992"
	I1202 20:01:46.290782  202120 cri.go:89] found id: ""
	I1202 20:01:46.290790  202120 logs.go:282] 1 containers: [86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992]
	I1202 20:01:46.290845  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:01:46.294680  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 20:01:46.294751  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 20:01:46.319909  202120 cri.go:89] found id: ""
	I1202 20:01:46.319933  202120 logs.go:282] 0 containers: []
	W1202 20:01:46.319942  202120 logs.go:284] No container was found matching "kindnet"
	I1202 20:01:46.319948  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1202 20:01:46.320010  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1202 20:01:46.349431  202120 cri.go:89] found id: ""
	I1202 20:01:46.349457  202120 logs.go:282] 0 containers: []
	W1202 20:01:46.349467  202120 logs.go:284] No container was found matching "storage-provisioner"
	I1202 20:01:46.349481  202120 logs.go:123] Gathering logs for kube-controller-manager [86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992] ...
	I1202 20:01:46.349492  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992"
	I1202 20:01:46.388458  202120 logs.go:123] Gathering logs for containerd ...
	I1202 20:01:46.388489  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 20:01:46.421701  202120 logs.go:123] Gathering logs for kubelet ...
	I1202 20:01:46.421734  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 20:01:46.482575  202120 logs.go:123] Gathering logs for dmesg ...
	I1202 20:01:46.482638  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 20:01:46.495407  202120 logs.go:123] Gathering logs for kube-apiserver [18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92] ...
	I1202 20:01:46.495434  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92"
	I1202 20:01:46.529941  202120 logs.go:123] Gathering logs for kube-scheduler [80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3] ...
	I1202 20:01:46.529970  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3"
	I1202 20:01:46.567251  202120 logs.go:123] Gathering logs for container status ...
	I1202 20:01:46.567282  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 20:01:46.597224  202120 logs.go:123] Gathering logs for describe nodes ...
	I1202 20:01:46.597252  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 20:01:46.664026  202120 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 20:01:46.664047  202120 logs.go:123] Gathering logs for etcd [f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53] ...
	I1202 20:01:46.664060  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53"
	I1202 20:01:49.205868  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:01:49.216440  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 20:01:49.216527  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 20:01:49.243701  202120 cri.go:89] found id: "18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92"
	I1202 20:01:49.243724  202120 cri.go:89] found id: ""
	I1202 20:01:49.243732  202120 logs.go:282] 1 containers: [18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92]
	I1202 20:01:49.243790  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:01:49.247537  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 20:01:49.247611  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 20:01:49.273039  202120 cri.go:89] found id: "f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53"
	I1202 20:01:49.273062  202120 cri.go:89] found id: ""
	I1202 20:01:49.273071  202120 logs.go:282] 1 containers: [f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53]
	I1202 20:01:49.273134  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:01:49.277010  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 20:01:49.277088  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 20:01:49.302932  202120 cri.go:89] found id: ""
	I1202 20:01:49.302956  202120 logs.go:282] 0 containers: []
	W1202 20:01:49.302965  202120 logs.go:284] No container was found matching "coredns"
	I1202 20:01:49.302972  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 20:01:49.303032  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 20:01:49.332221  202120 cri.go:89] found id: "80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3"
	I1202 20:01:49.332243  202120 cri.go:89] found id: ""
	I1202 20:01:49.332252  202120 logs.go:282] 1 containers: [80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3]
	I1202 20:01:49.332366  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:01:49.336440  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 20:01:49.336541  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 20:01:49.361669  202120 cri.go:89] found id: ""
	I1202 20:01:49.361695  202120 logs.go:282] 0 containers: []
	W1202 20:01:49.361704  202120 logs.go:284] No container was found matching "kube-proxy"
	I1202 20:01:49.361711  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 20:01:49.361770  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 20:01:49.387743  202120 cri.go:89] found id: "86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992"
	I1202 20:01:49.387768  202120 cri.go:89] found id: ""
	I1202 20:01:49.387777  202120 logs.go:282] 1 containers: [86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992]
	I1202 20:01:49.387841  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:01:49.391684  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 20:01:49.391779  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 20:01:49.419861  202120 cri.go:89] found id: ""
	I1202 20:01:49.419887  202120 logs.go:282] 0 containers: []
	W1202 20:01:49.419897  202120 logs.go:284] No container was found matching "kindnet"
	I1202 20:01:49.419904  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1202 20:01:49.419986  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1202 20:01:49.453010  202120 cri.go:89] found id: ""
	I1202 20:01:49.453082  202120 logs.go:282] 0 containers: []
	W1202 20:01:49.453111  202120 logs.go:284] No container was found matching "storage-provisioner"
	I1202 20:01:49.453132  202120 logs.go:123] Gathering logs for kubelet ...
	I1202 20:01:49.453159  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 20:01:49.514646  202120 logs.go:123] Gathering logs for kube-apiserver [18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92] ...
	I1202 20:01:49.514681  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92"
	I1202 20:01:49.550974  202120 logs.go:123] Gathering logs for etcd [f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53] ...
	I1202 20:01:49.551006  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53"
	I1202 20:01:49.585327  202120 logs.go:123] Gathering logs for kube-scheduler [80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3] ...
	I1202 20:01:49.585360  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3"
	I1202 20:01:49.617459  202120 logs.go:123] Gathering logs for dmesg ...
	I1202 20:01:49.617491  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 20:01:49.632251  202120 logs.go:123] Gathering logs for describe nodes ...
	I1202 20:01:49.632282  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 20:01:49.712681  202120 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 20:01:49.712704  202120 logs.go:123] Gathering logs for kube-controller-manager [86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992] ...
	I1202 20:01:49.712718  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992"
	I1202 20:01:49.776535  202120 logs.go:123] Gathering logs for containerd ...
	I1202 20:01:49.776571  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 20:01:49.810708  202120 logs.go:123] Gathering logs for container status ...
	I1202 20:01:49.810745  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 20:01:52.344506  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:01:52.355868  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 20:01:52.355955  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 20:01:52.383079  202120 cri.go:89] found id: "18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92"
	I1202 20:01:52.383108  202120 cri.go:89] found id: ""
	I1202 20:01:52.383117  202120 logs.go:282] 1 containers: [18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92]
	I1202 20:01:52.383192  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:01:52.386786  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 20:01:52.386887  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 20:01:52.413871  202120 cri.go:89] found id: "f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53"
	I1202 20:01:52.413893  202120 cri.go:89] found id: ""
	I1202 20:01:52.413902  202120 logs.go:282] 1 containers: [f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53]
	I1202 20:01:52.414007  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:01:52.417941  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 20:01:52.418049  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 20:01:52.444263  202120 cri.go:89] found id: ""
	I1202 20:01:52.444288  202120 logs.go:282] 0 containers: []
	W1202 20:01:52.444298  202120 logs.go:284] No container was found matching "coredns"
	I1202 20:01:52.444304  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 20:01:52.444394  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 20:01:52.470676  202120 cri.go:89] found id: "80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3"
	I1202 20:01:52.470699  202120 cri.go:89] found id: ""
	I1202 20:01:52.470707  202120 logs.go:282] 1 containers: [80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3]
	I1202 20:01:52.470770  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:01:52.474840  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 20:01:52.474956  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 20:01:52.505344  202120 cri.go:89] found id: ""
	I1202 20:01:52.505381  202120 logs.go:282] 0 containers: []
	W1202 20:01:52.505391  202120 logs.go:284] No container was found matching "kube-proxy"
	I1202 20:01:52.505398  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 20:01:52.505457  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 20:01:52.531466  202120 cri.go:89] found id: "86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992"
	I1202 20:01:52.531490  202120 cri.go:89] found id: ""
	I1202 20:01:52.531499  202120 logs.go:282] 1 containers: [86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992]
	I1202 20:01:52.531558  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:01:52.535372  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 20:01:52.535456  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 20:01:52.561395  202120 cri.go:89] found id: ""
	I1202 20:01:52.561419  202120 logs.go:282] 0 containers: []
	W1202 20:01:52.561428  202120 logs.go:284] No container was found matching "kindnet"
	I1202 20:01:52.561434  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1202 20:01:52.561498  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1202 20:01:52.586314  202120 cri.go:89] found id: ""
	I1202 20:01:52.586342  202120 logs.go:282] 0 containers: []
	W1202 20:01:52.586352  202120 logs.go:284] No container was found matching "storage-provisioner"
	I1202 20:01:52.586365  202120 logs.go:123] Gathering logs for container status ...
	I1202 20:01:52.586377  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 20:01:52.630400  202120 logs.go:123] Gathering logs for dmesg ...
	I1202 20:01:52.630429  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 20:01:52.652312  202120 logs.go:123] Gathering logs for describe nodes ...
	I1202 20:01:52.652396  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 20:01:52.751092  202120 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 20:01:52.751116  202120 logs.go:123] Gathering logs for etcd [f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53] ...
	I1202 20:01:52.751130  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53"
	I1202 20:01:52.789375  202120 logs.go:123] Gathering logs for kube-scheduler [80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3] ...
	I1202 20:01:52.789407  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3"
	I1202 20:01:52.820944  202120 logs.go:123] Gathering logs for containerd ...
	I1202 20:01:52.820974  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 20:01:52.855210  202120 logs.go:123] Gathering logs for kubelet ...
	I1202 20:01:52.855246  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 20:01:52.929018  202120 logs.go:123] Gathering logs for kube-apiserver [18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92] ...
	I1202 20:01:52.929093  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92"
	I1202 20:01:52.984375  202120 logs.go:123] Gathering logs for kube-controller-manager [86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992] ...
	I1202 20:01:52.984450  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992"
	I1202 20:01:55.536456  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:01:55.547706  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 20:01:55.547775  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 20:01:55.590966  202120 cri.go:89] found id: "18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92"
	I1202 20:01:55.590986  202120 cri.go:89] found id: ""
	I1202 20:01:55.590994  202120 logs.go:282] 1 containers: [18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92]
	I1202 20:01:55.591058  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:01:55.595217  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 20:01:55.595289  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 20:01:55.643848  202120 cri.go:89] found id: "f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53"
	I1202 20:01:55.643921  202120 cri.go:89] found id: ""
	I1202 20:01:55.643944  202120 logs.go:282] 1 containers: [f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53]
	I1202 20:01:55.644037  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:01:55.648574  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 20:01:55.648698  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 20:01:55.690818  202120 cri.go:89] found id: ""
	I1202 20:01:55.690894  202120 logs.go:282] 0 containers: []
	W1202 20:01:55.690917  202120 logs.go:284] No container was found matching "coredns"
	I1202 20:01:55.690941  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 20:01:55.691053  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 20:01:55.718128  202120 cri.go:89] found id: "80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3"
	I1202 20:01:55.718199  202120 cri.go:89] found id: ""
	I1202 20:01:55.718221  202120 logs.go:282] 1 containers: [80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3]
	I1202 20:01:55.718322  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:01:55.725378  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 20:01:55.725454  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 20:01:55.755215  202120 cri.go:89] found id: ""
	I1202 20:01:55.755242  202120 logs.go:282] 0 containers: []
	W1202 20:01:55.755251  202120 logs.go:284] No container was found matching "kube-proxy"
	I1202 20:01:55.755258  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 20:01:55.755320  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 20:01:55.789515  202120 cri.go:89] found id: "86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992"
	I1202 20:01:55.789540  202120 cri.go:89] found id: ""
	I1202 20:01:55.789549  202120 logs.go:282] 1 containers: [86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992]
	I1202 20:01:55.789614  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:01:55.793470  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 20:01:55.793560  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 20:01:55.818612  202120 cri.go:89] found id: ""
	I1202 20:01:55.818639  202120 logs.go:282] 0 containers: []
	W1202 20:01:55.818659  202120 logs.go:284] No container was found matching "kindnet"
	I1202 20:01:55.818668  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1202 20:01:55.818746  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1202 20:01:55.854360  202120 cri.go:89] found id: ""
	I1202 20:01:55.854385  202120 logs.go:282] 0 containers: []
	W1202 20:01:55.854393  202120 logs.go:284] No container was found matching "storage-provisioner"
	I1202 20:01:55.854408  202120 logs.go:123] Gathering logs for dmesg ...
	I1202 20:01:55.854420  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 20:01:55.868436  202120 logs.go:123] Gathering logs for etcd [f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53] ...
	I1202 20:01:55.868464  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53"
	I1202 20:01:55.900686  202120 logs.go:123] Gathering logs for containerd ...
	I1202 20:01:55.900718  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 20:01:55.933321  202120 logs.go:123] Gathering logs for container status ...
	I1202 20:01:55.933356  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 20:01:55.962516  202120 logs.go:123] Gathering logs for kubelet ...
	I1202 20:01:55.962542  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 20:01:56.023072  202120 logs.go:123] Gathering logs for describe nodes ...
	I1202 20:01:56.023110  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 20:01:56.104801  202120 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 20:01:56.104825  202120 logs.go:123] Gathering logs for kube-apiserver [18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92] ...
	I1202 20:01:56.104840  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92"
	I1202 20:01:56.157452  202120 logs.go:123] Gathering logs for kube-scheduler [80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3] ...
	I1202 20:01:56.157485  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3"
	I1202 20:01:56.190183  202120 logs.go:123] Gathering logs for kube-controller-manager [86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992] ...
	I1202 20:01:56.190215  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992"
	I1202 20:01:58.740497  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:01:58.751515  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 20:01:58.751596  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 20:01:58.777058  202120 cri.go:89] found id: "18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92"
	I1202 20:01:58.777078  202120 cri.go:89] found id: ""
	I1202 20:01:58.777087  202120 logs.go:282] 1 containers: [18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92]
	I1202 20:01:58.777144  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:01:58.781008  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 20:01:58.781085  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 20:01:58.806986  202120 cri.go:89] found id: "f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53"
	I1202 20:01:58.807010  202120 cri.go:89] found id: ""
	I1202 20:01:58.807019  202120 logs.go:282] 1 containers: [f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53]
	I1202 20:01:58.807082  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:01:58.810951  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 20:01:58.811029  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 20:01:58.837186  202120 cri.go:89] found id: ""
	I1202 20:01:58.837211  202120 logs.go:282] 0 containers: []
	W1202 20:01:58.837220  202120 logs.go:284] No container was found matching "coredns"
	I1202 20:01:58.837227  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 20:01:58.837287  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 20:01:58.867401  202120 cri.go:89] found id: "80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3"
	I1202 20:01:58.867425  202120 cri.go:89] found id: ""
	I1202 20:01:58.867433  202120 logs.go:282] 1 containers: [80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3]
	I1202 20:01:58.867498  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:01:58.871395  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 20:01:58.871469  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 20:01:58.898400  202120 cri.go:89] found id: ""
	I1202 20:01:58.898425  202120 logs.go:282] 0 containers: []
	W1202 20:01:58.898434  202120 logs.go:284] No container was found matching "kube-proxy"
	I1202 20:01:58.898440  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 20:01:58.898499  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 20:01:58.926281  202120 cri.go:89] found id: "86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992"
	I1202 20:01:58.926304  202120 cri.go:89] found id: ""
	I1202 20:01:58.926312  202120 logs.go:282] 1 containers: [86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992]
	I1202 20:01:58.926372  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:01:58.930303  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 20:01:58.930385  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 20:01:58.960404  202120 cri.go:89] found id: ""
	I1202 20:01:58.960436  202120 logs.go:282] 0 containers: []
	W1202 20:01:58.960445  202120 logs.go:284] No container was found matching "kindnet"
	I1202 20:01:58.960457  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1202 20:01:58.960520  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1202 20:01:58.990345  202120 cri.go:89] found id: ""
	I1202 20:01:58.990373  202120 logs.go:282] 0 containers: []
	W1202 20:01:58.990383  202120 logs.go:284] No container was found matching "storage-provisioner"
	I1202 20:01:58.990397  202120 logs.go:123] Gathering logs for dmesg ...
	I1202 20:01:58.990409  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 20:01:59.003908  202120 logs.go:123] Gathering logs for kube-apiserver [18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92] ...
	I1202 20:01:59.004008  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92"
	I1202 20:01:59.042418  202120 logs.go:123] Gathering logs for etcd [f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53] ...
	I1202 20:01:59.042452  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53"
	I1202 20:01:59.074656  202120 logs.go:123] Gathering logs for containerd ...
	I1202 20:01:59.074688  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 20:01:59.109009  202120 logs.go:123] Gathering logs for describe nodes ...
	I1202 20:01:59.109088  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 20:01:59.176691  202120 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 20:01:59.176716  202120 logs.go:123] Gathering logs for kube-scheduler [80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3] ...
	I1202 20:01:59.176730  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3"
	I1202 20:01:59.209595  202120 logs.go:123] Gathering logs for kube-controller-manager [86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992] ...
	I1202 20:01:59.209625  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992"
	I1202 20:01:59.260235  202120 logs.go:123] Gathering logs for container status ...
	I1202 20:01:59.260273  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 20:01:59.290475  202120 logs.go:123] Gathering logs for kubelet ...
	I1202 20:01:59.290505  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 20:02:01.849017  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:02:01.859493  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 20:02:01.859562  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 20:02:01.887373  202120 cri.go:89] found id: "18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92"
	I1202 20:02:01.887396  202120 cri.go:89] found id: ""
	I1202 20:02:01.887404  202120 logs.go:282] 1 containers: [18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92]
	I1202 20:02:01.887465  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:02:01.891455  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 20:02:01.891536  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 20:02:01.917567  202120 cri.go:89] found id: "f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53"
	I1202 20:02:01.917589  202120 cri.go:89] found id: ""
	I1202 20:02:01.917597  202120 logs.go:282] 1 containers: [f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53]
	I1202 20:02:01.917656  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:02:01.921658  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 20:02:01.921757  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 20:02:01.947143  202120 cri.go:89] found id: ""
	I1202 20:02:01.947168  202120 logs.go:282] 0 containers: []
	W1202 20:02:01.947177  202120 logs.go:284] No container was found matching "coredns"
	I1202 20:02:01.947183  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 20:02:01.947244  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 20:02:01.981030  202120 cri.go:89] found id: "80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3"
	I1202 20:02:01.981053  202120 cri.go:89] found id: ""
	I1202 20:02:01.981061  202120 logs.go:282] 1 containers: [80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3]
	I1202 20:02:01.981120  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:02:01.984967  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 20:02:01.985039  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 20:02:02.016181  202120 cri.go:89] found id: ""
	I1202 20:02:02.016210  202120 logs.go:282] 0 containers: []
	W1202 20:02:02.016220  202120 logs.go:284] No container was found matching "kube-proxy"
	I1202 20:02:02.016246  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 20:02:02.016355  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 20:02:02.043604  202120 cri.go:89] found id: "86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992"
	I1202 20:02:02.043627  202120 cri.go:89] found id: ""
	I1202 20:02:02.043641  202120 logs.go:282] 1 containers: [86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992]
	I1202 20:02:02.043712  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:02:02.047597  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 20:02:02.047672  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 20:02:02.073536  202120 cri.go:89] found id: ""
	I1202 20:02:02.073609  202120 logs.go:282] 0 containers: []
	W1202 20:02:02.073627  202120 logs.go:284] No container was found matching "kindnet"
	I1202 20:02:02.073634  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1202 20:02:02.073703  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1202 20:02:02.099133  202120 cri.go:89] found id: ""
	I1202 20:02:02.099156  202120 logs.go:282] 0 containers: []
	W1202 20:02:02.099165  202120 logs.go:284] No container was found matching "storage-provisioner"
	I1202 20:02:02.099181  202120 logs.go:123] Gathering logs for kubelet ...
	I1202 20:02:02.099192  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 20:02:02.159323  202120 logs.go:123] Gathering logs for dmesg ...
	I1202 20:02:02.159360  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 20:02:02.172549  202120 logs.go:123] Gathering logs for kube-apiserver [18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92] ...
	I1202 20:02:02.172574  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92"
	I1202 20:02:02.210924  202120 logs.go:123] Gathering logs for etcd [f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53] ...
	I1202 20:02:02.210955  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53"
	I1202 20:02:02.245002  202120 logs.go:123] Gathering logs for kube-scheduler [80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3] ...
	I1202 20:02:02.245038  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3"
	I1202 20:02:02.281600  202120 logs.go:123] Gathering logs for kube-controller-manager [86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992] ...
	I1202 20:02:02.281629  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992"
	I1202 20:02:02.319770  202120 logs.go:123] Gathering logs for containerd ...
	I1202 20:02:02.319808  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 20:02:02.354371  202120 logs.go:123] Gathering logs for container status ...
	I1202 20:02:02.354404  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 20:02:02.382241  202120 logs.go:123] Gathering logs for describe nodes ...
	I1202 20:02:02.382269  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 20:02:02.459946  202120 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 20:02:04.961663  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:02:04.972424  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 20:02:04.972500  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 20:02:04.999729  202120 cri.go:89] found id: "18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92"
	I1202 20:02:04.999748  202120 cri.go:89] found id: ""
	I1202 20:02:04.999756  202120 logs.go:282] 1 containers: [18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92]
	I1202 20:02:04.999816  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:02:05.004556  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 20:02:05.004640  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 20:02:05.035033  202120 cri.go:89] found id: "f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53"
	I1202 20:02:05.035055  202120 cri.go:89] found id: ""
	I1202 20:02:05.035065  202120 logs.go:282] 1 containers: [f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53]
	I1202 20:02:05.035126  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:02:05.039133  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 20:02:05.039243  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 20:02:05.065663  202120 cri.go:89] found id: ""
	I1202 20:02:05.065688  202120 logs.go:282] 0 containers: []
	W1202 20:02:05.065697  202120 logs.go:284] No container was found matching "coredns"
	I1202 20:02:05.065703  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 20:02:05.065766  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 20:02:05.092209  202120 cri.go:89] found id: "80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3"
	I1202 20:02:05.092242  202120 cri.go:89] found id: ""
	I1202 20:02:05.092251  202120 logs.go:282] 1 containers: [80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3]
	I1202 20:02:05.092343  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:02:05.096446  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 20:02:05.096529  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 20:02:05.127566  202120 cri.go:89] found id: ""
	I1202 20:02:05.127590  202120 logs.go:282] 0 containers: []
	W1202 20:02:05.127599  202120 logs.go:284] No container was found matching "kube-proxy"
	I1202 20:02:05.127607  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 20:02:05.127675  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 20:02:05.156126  202120 cri.go:89] found id: "86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992"
	I1202 20:02:05.156152  202120 cri.go:89] found id: ""
	I1202 20:02:05.156161  202120 logs.go:282] 1 containers: [86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992]
	I1202 20:02:05.156270  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:02:05.160306  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 20:02:05.160403  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 20:02:05.190565  202120 cri.go:89] found id: ""
	I1202 20:02:05.190592  202120 logs.go:282] 0 containers: []
	W1202 20:02:05.190601  202120 logs.go:284] No container was found matching "kindnet"
	I1202 20:02:05.190607  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1202 20:02:05.190674  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1202 20:02:05.220208  202120 cri.go:89] found id: ""
	I1202 20:02:05.220232  202120 logs.go:282] 0 containers: []
	W1202 20:02:05.220240  202120 logs.go:284] No container was found matching "storage-provisioner"
	I1202 20:02:05.220255  202120 logs.go:123] Gathering logs for kube-apiserver [18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92] ...
	I1202 20:02:05.220267  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92"
	I1202 20:02:05.254664  202120 logs.go:123] Gathering logs for etcd [f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53] ...
	I1202 20:02:05.254699  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53"
	I1202 20:02:05.292495  202120 logs.go:123] Gathering logs for kube-controller-manager [86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992] ...
	I1202 20:02:05.292535  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992"
	I1202 20:02:05.332127  202120 logs.go:123] Gathering logs for kubelet ...
	I1202 20:02:05.332165  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 20:02:05.394868  202120 logs.go:123] Gathering logs for dmesg ...
	I1202 20:02:05.394905  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 20:02:05.408369  202120 logs.go:123] Gathering logs for kube-scheduler [80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3] ...
	I1202 20:02:05.408394  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3"
	I1202 20:02:05.470320  202120 logs.go:123] Gathering logs for containerd ...
	I1202 20:02:05.470393  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 20:02:05.510050  202120 logs.go:123] Gathering logs for container status ...
	I1202 20:02:05.510126  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 20:02:05.550980  202120 logs.go:123] Gathering logs for describe nodes ...
	I1202 20:02:05.551008  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 20:02:05.619015  202120 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 20:02:08.120456  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:02:08.130876  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 20:02:08.130954  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 20:02:08.161886  202120 cri.go:89] found id: "18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92"
	I1202 20:02:08.161908  202120 cri.go:89] found id: ""
	I1202 20:02:08.161916  202120 logs.go:282] 1 containers: [18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92]
	I1202 20:02:08.161978  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:02:08.166202  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 20:02:08.166303  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 20:02:08.192181  202120 cri.go:89] found id: "f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53"
	I1202 20:02:08.192206  202120 cri.go:89] found id: ""
	I1202 20:02:08.192214  202120 logs.go:282] 1 containers: [f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53]
	I1202 20:02:08.192273  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:02:08.196166  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 20:02:08.196241  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 20:02:08.223536  202120 cri.go:89] found id: ""
	I1202 20:02:08.223560  202120 logs.go:282] 0 containers: []
	W1202 20:02:08.223584  202120 logs.go:284] No container was found matching "coredns"
	I1202 20:02:08.223599  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 20:02:08.223677  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 20:02:08.250701  202120 cri.go:89] found id: "80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3"
	I1202 20:02:08.250724  202120 cri.go:89] found id: ""
	I1202 20:02:08.250733  202120 logs.go:282] 1 containers: [80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3]
	I1202 20:02:08.250822  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:02:08.254816  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 20:02:08.254906  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 20:02:08.286001  202120 cri.go:89] found id: ""
	I1202 20:02:08.286028  202120 logs.go:282] 0 containers: []
	W1202 20:02:08.286037  202120 logs.go:284] No container was found matching "kube-proxy"
	I1202 20:02:08.286044  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 20:02:08.286114  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 20:02:08.312660  202120 cri.go:89] found id: "86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992"
	I1202 20:02:08.312683  202120 cri.go:89] found id: ""
	I1202 20:02:08.312691  202120 logs.go:282] 1 containers: [86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992]
	I1202 20:02:08.312753  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:02:08.316728  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 20:02:08.316841  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 20:02:08.343627  202120 cri.go:89] found id: ""
	I1202 20:02:08.343654  202120 logs.go:282] 0 containers: []
	W1202 20:02:08.343663  202120 logs.go:284] No container was found matching "kindnet"
	I1202 20:02:08.343669  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1202 20:02:08.343733  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1202 20:02:08.368962  202120 cri.go:89] found id: ""
	I1202 20:02:08.368988  202120 logs.go:282] 0 containers: []
	W1202 20:02:08.368997  202120 logs.go:284] No container was found matching "storage-provisioner"
	I1202 20:02:08.369012  202120 logs.go:123] Gathering logs for containerd ...
	I1202 20:02:08.369024  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 20:02:08.401593  202120 logs.go:123] Gathering logs for container status ...
	I1202 20:02:08.401629  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 20:02:08.434582  202120 logs.go:123] Gathering logs for describe nodes ...
	I1202 20:02:08.434610  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 20:02:08.510066  202120 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 20:02:08.510085  202120 logs.go:123] Gathering logs for etcd [f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53] ...
	I1202 20:02:08.510099  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53"
	I1202 20:02:08.544031  202120 logs.go:123] Gathering logs for kube-scheduler [80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3] ...
	I1202 20:02:08.544063  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3"
	I1202 20:02:08.575669  202120 logs.go:123] Gathering logs for kubelet ...
	I1202 20:02:08.575697  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 20:02:08.633311  202120 logs.go:123] Gathering logs for dmesg ...
	I1202 20:02:08.633347  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 20:02:08.650154  202120 logs.go:123] Gathering logs for kube-apiserver [18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92] ...
	I1202 20:02:08.650186  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92"
	I1202 20:02:08.685804  202120 logs.go:123] Gathering logs for kube-controller-manager [86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992] ...
	I1202 20:02:08.685838  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992"
	I1202 20:02:11.237232  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:02:11.247829  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 20:02:11.247901  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 20:02:11.273810  202120 cri.go:89] found id: "18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92"
	I1202 20:02:11.273833  202120 cri.go:89] found id: ""
	I1202 20:02:11.273842  202120 logs.go:282] 1 containers: [18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92]
	I1202 20:02:11.273922  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:02:11.277704  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 20:02:11.277780  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 20:02:11.304132  202120 cri.go:89] found id: "f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53"
	I1202 20:02:11.304158  202120 cri.go:89] found id: ""
	I1202 20:02:11.304167  202120 logs.go:282] 1 containers: [f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53]
	I1202 20:02:11.304253  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:02:11.307989  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 20:02:11.308061  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 20:02:11.333761  202120 cri.go:89] found id: ""
	I1202 20:02:11.333788  202120 logs.go:282] 0 containers: []
	W1202 20:02:11.333798  202120 logs.go:284] No container was found matching "coredns"
	I1202 20:02:11.333806  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 20:02:11.333865  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 20:02:11.358431  202120 cri.go:89] found id: "80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3"
	I1202 20:02:11.358453  202120 cri.go:89] found id: ""
	I1202 20:02:11.358462  202120 logs.go:282] 1 containers: [80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3]
	I1202 20:02:11.358538  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:02:11.362416  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 20:02:11.362536  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 20:02:11.387695  202120 cri.go:89] found id: ""
	I1202 20:02:11.387720  202120 logs.go:282] 0 containers: []
	W1202 20:02:11.387729  202120 logs.go:284] No container was found matching "kube-proxy"
	I1202 20:02:11.387736  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 20:02:11.387794  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 20:02:11.415107  202120 cri.go:89] found id: "86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992"
	I1202 20:02:11.415131  202120 cri.go:89] found id: ""
	I1202 20:02:11.415140  202120 logs.go:282] 1 containers: [86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992]
	I1202 20:02:11.415199  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:02:11.421307  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 20:02:11.421379  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 20:02:11.451662  202120 cri.go:89] found id: ""
	I1202 20:02:11.451685  202120 logs.go:282] 0 containers: []
	W1202 20:02:11.451694  202120 logs.go:284] No container was found matching "kindnet"
	I1202 20:02:11.451700  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1202 20:02:11.451760  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1202 20:02:11.487149  202120 cri.go:89] found id: ""
	I1202 20:02:11.487172  202120 logs.go:282] 0 containers: []
	W1202 20:02:11.487180  202120 logs.go:284] No container was found matching "storage-provisioner"
	I1202 20:02:11.487193  202120 logs.go:123] Gathering logs for kubelet ...
	I1202 20:02:11.487206  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 20:02:11.546433  202120 logs.go:123] Gathering logs for kube-apiserver [18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92] ...
	I1202 20:02:11.546470  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92"
	I1202 20:02:11.580668  202120 logs.go:123] Gathering logs for containerd ...
	I1202 20:02:11.580698  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 20:02:11.615019  202120 logs.go:123] Gathering logs for container status ...
	I1202 20:02:11.615054  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 20:02:11.644870  202120 logs.go:123] Gathering logs for dmesg ...
	I1202 20:02:11.644898  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 20:02:11.658702  202120 logs.go:123] Gathering logs for describe nodes ...
	I1202 20:02:11.658730  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 20:02:11.726256  202120 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 20:02:11.726277  202120 logs.go:123] Gathering logs for etcd [f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53] ...
	I1202 20:02:11.726291  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53"
	I1202 20:02:11.770843  202120 logs.go:123] Gathering logs for kube-scheduler [80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3] ...
	I1202 20:02:11.770873  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3"
	I1202 20:02:11.804070  202120 logs.go:123] Gathering logs for kube-controller-manager [86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992] ...
	I1202 20:02:11.804142  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992"
	I1202 20:02:14.363748  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:02:14.374271  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 20:02:14.374370  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 20:02:14.403074  202120 cri.go:89] found id: "18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92"
	I1202 20:02:14.403135  202120 cri.go:89] found id: ""
	I1202 20:02:14.403157  202120 logs.go:282] 1 containers: [18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92]
	I1202 20:02:14.403232  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:02:14.407111  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 20:02:14.407231  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 20:02:14.441960  202120 cri.go:89] found id: "f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53"
	I1202 20:02:14.442022  202120 cri.go:89] found id: ""
	I1202 20:02:14.442043  202120 logs.go:282] 1 containers: [f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53]
	I1202 20:02:14.442119  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:02:14.446762  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 20:02:14.447106  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 20:02:14.490742  202120 cri.go:89] found id: ""
	I1202 20:02:14.490809  202120 logs.go:282] 0 containers: []
	W1202 20:02:14.490831  202120 logs.go:284] No container was found matching "coredns"
	I1202 20:02:14.490849  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 20:02:14.490934  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 20:02:14.517357  202120 cri.go:89] found id: "80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3"
	I1202 20:02:14.517379  202120 cri.go:89] found id: ""
	I1202 20:02:14.517388  202120 logs.go:282] 1 containers: [80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3]
	I1202 20:02:14.517466  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:02:14.521176  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 20:02:14.521247  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 20:02:14.548992  202120 cri.go:89] found id: ""
	I1202 20:02:14.549021  202120 logs.go:282] 0 containers: []
	W1202 20:02:14.549035  202120 logs.go:284] No container was found matching "kube-proxy"
	I1202 20:02:14.549042  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 20:02:14.549103  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 20:02:14.574700  202120 cri.go:89] found id: "86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992"
	I1202 20:02:14.574723  202120 cri.go:89] found id: ""
	I1202 20:02:14.574731  202120 logs.go:282] 1 containers: [86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992]
	I1202 20:02:14.574818  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:02:14.578587  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 20:02:14.578704  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 20:02:14.604378  202120 cri.go:89] found id: ""
	I1202 20:02:14.604405  202120 logs.go:282] 0 containers: []
	W1202 20:02:14.604414  202120 logs.go:284] No container was found matching "kindnet"
	I1202 20:02:14.604421  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1202 20:02:14.604484  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1202 20:02:14.629817  202120 cri.go:89] found id: ""
	I1202 20:02:14.629843  202120 logs.go:282] 0 containers: []
	W1202 20:02:14.629852  202120 logs.go:284] No container was found matching "storage-provisioner"
	I1202 20:02:14.629868  202120 logs.go:123] Gathering logs for dmesg ...
	I1202 20:02:14.629880  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 20:02:14.642729  202120 logs.go:123] Gathering logs for describe nodes ...
	I1202 20:02:14.642758  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 20:02:14.712762  202120 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 20:02:14.712782  202120 logs.go:123] Gathering logs for etcd [f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53] ...
	I1202 20:02:14.712800  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53"
	I1202 20:02:14.749825  202120 logs.go:123] Gathering logs for kube-scheduler [80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3] ...
	I1202 20:02:14.749854  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3"
	I1202 20:02:14.784413  202120 logs.go:123] Gathering logs for kube-controller-manager [86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992] ...
	I1202 20:02:14.784486  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992"
	I1202 20:02:14.827123  202120 logs.go:123] Gathering logs for containerd ...
	I1202 20:02:14.827153  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 20:02:14.861178  202120 logs.go:123] Gathering logs for container status ...
	I1202 20:02:14.861214  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 20:02:14.892511  202120 logs.go:123] Gathering logs for kubelet ...
	I1202 20:02:14.892539  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 20:02:14.952748  202120 logs.go:123] Gathering logs for kube-apiserver [18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92] ...
	I1202 20:02:14.952782  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92"
	I1202 20:02:17.488452  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:02:17.498627  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 20:02:17.498695  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 20:02:17.524223  202120 cri.go:89] found id: "18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92"
	I1202 20:02:17.524246  202120 cri.go:89] found id: ""
	I1202 20:02:17.524254  202120 logs.go:282] 1 containers: [18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92]
	I1202 20:02:17.524315  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:02:17.528220  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 20:02:17.528292  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 20:02:17.553832  202120 cri.go:89] found id: "f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53"
	I1202 20:02:17.553854  202120 cri.go:89] found id: ""
	I1202 20:02:17.553861  202120 logs.go:282] 1 containers: [f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53]
	I1202 20:02:17.553921  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:02:17.557685  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 20:02:17.557760  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 20:02:17.582897  202120 cri.go:89] found id: ""
	I1202 20:02:17.582923  202120 logs.go:282] 0 containers: []
	W1202 20:02:17.582932  202120 logs.go:284] No container was found matching "coredns"
	I1202 20:02:17.582939  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 20:02:17.583008  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 20:02:17.613612  202120 cri.go:89] found id: "80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3"
	I1202 20:02:17.613634  202120 cri.go:89] found id: ""
	I1202 20:02:17.613642  202120 logs.go:282] 1 containers: [80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3]
	I1202 20:02:17.613726  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:02:17.617378  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 20:02:17.617449  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 20:02:17.646030  202120 cri.go:89] found id: ""
	I1202 20:02:17.646054  202120 logs.go:282] 0 containers: []
	W1202 20:02:17.646063  202120 logs.go:284] No container was found matching "kube-proxy"
	I1202 20:02:17.646069  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 20:02:17.646126  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 20:02:17.675459  202120 cri.go:89] found id: "86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992"
	I1202 20:02:17.675479  202120 cri.go:89] found id: ""
	I1202 20:02:17.675487  202120 logs.go:282] 1 containers: [86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992]
	I1202 20:02:17.675544  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:02:17.679628  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 20:02:17.679715  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 20:02:17.709496  202120 cri.go:89] found id: ""
	I1202 20:02:17.709522  202120 logs.go:282] 0 containers: []
	W1202 20:02:17.709530  202120 logs.go:284] No container was found matching "kindnet"
	I1202 20:02:17.709536  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1202 20:02:17.709594  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1202 20:02:17.734783  202120 cri.go:89] found id: ""
	I1202 20:02:17.734809  202120 logs.go:282] 0 containers: []
	W1202 20:02:17.734818  202120 logs.go:284] No container was found matching "storage-provisioner"
	I1202 20:02:17.734831  202120 logs.go:123] Gathering logs for container status ...
	I1202 20:02:17.734842  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 20:02:17.764012  202120 logs.go:123] Gathering logs for kubelet ...
	I1202 20:02:17.764040  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 20:02:17.822711  202120 logs.go:123] Gathering logs for describe nodes ...
	I1202 20:02:17.822742  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 20:02:17.887009  202120 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 20:02:17.887033  202120 logs.go:123] Gathering logs for kube-apiserver [18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92] ...
	I1202 20:02:17.887046  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92"
	I1202 20:02:17.930762  202120 logs.go:123] Gathering logs for etcd [f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53] ...
	I1202 20:02:17.930793  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53"
	I1202 20:02:17.963460  202120 logs.go:123] Gathering logs for kube-scheduler [80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3] ...
	I1202 20:02:17.963494  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3"
	I1202 20:02:17.994798  202120 logs.go:123] Gathering logs for containerd ...
	I1202 20:02:17.994825  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 20:02:18.028134  202120 logs.go:123] Gathering logs for dmesg ...
	I1202 20:02:18.028167  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 20:02:18.043812  202120 logs.go:123] Gathering logs for kube-controller-manager [86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992] ...
	I1202 20:02:18.043849  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992"
	I1202 20:02:20.584490  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:02:20.594630  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 20:02:20.594704  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 20:02:20.620618  202120 cri.go:89] found id: "18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92"
	I1202 20:02:20.620639  202120 cri.go:89] found id: ""
	I1202 20:02:20.620647  202120 logs.go:282] 1 containers: [18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92]
	I1202 20:02:20.620704  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:02:20.624574  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 20:02:20.624651  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 20:02:20.650980  202120 cri.go:89] found id: "f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53"
	I1202 20:02:20.651006  202120 cri.go:89] found id: ""
	I1202 20:02:20.651013  202120 logs.go:282] 1 containers: [f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53]
	I1202 20:02:20.651074  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:02:20.655075  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 20:02:20.655149  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 20:02:20.684053  202120 cri.go:89] found id: ""
	I1202 20:02:20.684080  202120 logs.go:282] 0 containers: []
	W1202 20:02:20.684089  202120 logs.go:284] No container was found matching "coredns"
	I1202 20:02:20.684095  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 20:02:20.684156  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 20:02:20.710254  202120 cri.go:89] found id: "80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3"
	I1202 20:02:20.710278  202120 cri.go:89] found id: ""
	I1202 20:02:20.710286  202120 logs.go:282] 1 containers: [80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3]
	I1202 20:02:20.710349  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:02:20.714143  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 20:02:20.714221  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 20:02:20.743202  202120 cri.go:89] found id: ""
	I1202 20:02:20.743275  202120 logs.go:282] 0 containers: []
	W1202 20:02:20.743298  202120 logs.go:284] No container was found matching "kube-proxy"
	I1202 20:02:20.743317  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 20:02:20.743404  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 20:02:20.769628  202120 cri.go:89] found id: "86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992"
	I1202 20:02:20.769652  202120 cri.go:89] found id: ""
	I1202 20:02:20.769660  202120 logs.go:282] 1 containers: [86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992]
	I1202 20:02:20.769717  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:02:20.773563  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 20:02:20.773638  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 20:02:20.799108  202120 cri.go:89] found id: ""
	I1202 20:02:20.799130  202120 logs.go:282] 0 containers: []
	W1202 20:02:20.799139  202120 logs.go:284] No container was found matching "kindnet"
	I1202 20:02:20.799145  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1202 20:02:20.799205  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1202 20:02:20.824268  202120 cri.go:89] found id: ""
	I1202 20:02:20.824293  202120 logs.go:282] 0 containers: []
	W1202 20:02:20.824302  202120 logs.go:284] No container was found matching "storage-provisioner"
	I1202 20:02:20.824348  202120 logs.go:123] Gathering logs for kubelet ...
	I1202 20:02:20.824361  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 20:02:20.881845  202120 logs.go:123] Gathering logs for kube-apiserver [18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92] ...
	I1202 20:02:20.881879  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92"
	I1202 20:02:20.919267  202120 logs.go:123] Gathering logs for kube-controller-manager [86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992] ...
	I1202 20:02:20.919297  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992"
	I1202 20:02:20.971064  202120 logs.go:123] Gathering logs for containerd ...
	I1202 20:02:20.971098  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 20:02:21.003713  202120 logs.go:123] Gathering logs for dmesg ...
	I1202 20:02:21.003753  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 20:02:21.019313  202120 logs.go:123] Gathering logs for describe nodes ...
	I1202 20:02:21.019342  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 20:02:21.084110  202120 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 20:02:21.084133  202120 logs.go:123] Gathering logs for etcd [f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53] ...
	I1202 20:02:21.084146  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53"
	I1202 20:02:21.125282  202120 logs.go:123] Gathering logs for kube-scheduler [80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3] ...
	I1202 20:02:21.125320  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3"
	I1202 20:02:21.161359  202120 logs.go:123] Gathering logs for container status ...
	I1202 20:02:21.161387  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 20:02:23.696165  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:02:23.706707  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 20:02:23.706781  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 20:02:23.732616  202120 cri.go:89] found id: "18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92"
	I1202 20:02:23.732638  202120 cri.go:89] found id: ""
	I1202 20:02:23.732647  202120 logs.go:282] 1 containers: [18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92]
	I1202 20:02:23.732710  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:02:23.736587  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 20:02:23.736660  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 20:02:23.771033  202120 cri.go:89] found id: "f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53"
	I1202 20:02:23.771058  202120 cri.go:89] found id: ""
	I1202 20:02:23.771066  202120 logs.go:282] 1 containers: [f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53]
	I1202 20:02:23.771131  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:02:23.774954  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 20:02:23.775031  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 20:02:23.801108  202120 cri.go:89] found id: ""
	I1202 20:02:23.801133  202120 logs.go:282] 0 containers: []
	W1202 20:02:23.801142  202120 logs.go:284] No container was found matching "coredns"
	I1202 20:02:23.801149  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 20:02:23.801233  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 20:02:23.827566  202120 cri.go:89] found id: "80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3"
	I1202 20:02:23.827636  202120 cri.go:89] found id: ""
	I1202 20:02:23.827670  202120 logs.go:282] 1 containers: [80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3]
	I1202 20:02:23.827754  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:02:23.832161  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 20:02:23.832264  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 20:02:23.857441  202120 cri.go:89] found id: ""
	I1202 20:02:23.857464  202120 logs.go:282] 0 containers: []
	W1202 20:02:23.857473  202120 logs.go:284] No container was found matching "kube-proxy"
	I1202 20:02:23.857479  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 20:02:23.857538  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 20:02:23.884004  202120 cri.go:89] found id: "86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992"
	I1202 20:02:23.884027  202120 cri.go:89] found id: ""
	I1202 20:02:23.884035  202120 logs.go:282] 1 containers: [86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992]
	I1202 20:02:23.884093  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:02:23.887989  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 20:02:23.888079  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 20:02:23.913710  202120 cri.go:89] found id: ""
	I1202 20:02:23.913741  202120 logs.go:282] 0 containers: []
	W1202 20:02:23.913750  202120 logs.go:284] No container was found matching "kindnet"
	I1202 20:02:23.913757  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1202 20:02:23.913819  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1202 20:02:23.939213  202120 cri.go:89] found id: ""
	I1202 20:02:23.939238  202120 logs.go:282] 0 containers: []
	W1202 20:02:23.939247  202120 logs.go:284] No container was found matching "storage-provisioner"
	I1202 20:02:23.939260  202120 logs.go:123] Gathering logs for dmesg ...
	I1202 20:02:23.939272  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 20:02:23.952270  202120 logs.go:123] Gathering logs for kube-apiserver [18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92] ...
	I1202 20:02:23.952300  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92"
	I1202 20:02:23.986258  202120 logs.go:123] Gathering logs for kube-controller-manager [86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992] ...
	I1202 20:02:23.986292  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992"
	I1202 20:02:24.024576  202120 logs.go:123] Gathering logs for containerd ...
	I1202 20:02:24.024653  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 20:02:24.057206  202120 logs.go:123] Gathering logs for kubelet ...
	I1202 20:02:24.057237  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 20:02:24.116062  202120 logs.go:123] Gathering logs for describe nodes ...
	I1202 20:02:24.116111  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 20:02:24.191928  202120 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 20:02:24.191948  202120 logs.go:123] Gathering logs for etcd [f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53] ...
	I1202 20:02:24.191961  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53"
	I1202 20:02:24.234322  202120 logs.go:123] Gathering logs for kube-scheduler [80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3] ...
	I1202 20:02:24.234400  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3"
	I1202 20:02:24.273232  202120 logs.go:123] Gathering logs for container status ...
	I1202 20:02:24.273260  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 20:02:26.803135  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:02:26.814742  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 20:02:26.814819  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 20:02:26.847242  202120 cri.go:89] found id: "18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92"
	I1202 20:02:26.847262  202120 cri.go:89] found id: ""
	I1202 20:02:26.847274  202120 logs.go:282] 1 containers: [18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92]
	I1202 20:02:26.847329  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:02:26.851525  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 20:02:26.851597  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 20:02:26.883472  202120 cri.go:89] found id: "f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53"
	I1202 20:02:26.883492  202120 cri.go:89] found id: ""
	I1202 20:02:26.883500  202120 logs.go:282] 1 containers: [f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53]
	I1202 20:02:26.883553  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:02:26.893927  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 20:02:26.894004  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 20:02:26.922490  202120 cri.go:89] found id: ""
	I1202 20:02:26.922514  202120 logs.go:282] 0 containers: []
	W1202 20:02:26.922524  202120 logs.go:284] No container was found matching "coredns"
	I1202 20:02:26.922531  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 20:02:26.922596  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 20:02:26.977123  202120 cri.go:89] found id: "80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3"
	I1202 20:02:26.977145  202120 cri.go:89] found id: ""
	I1202 20:02:26.977154  202120 logs.go:282] 1 containers: [80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3]
	I1202 20:02:26.977217  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:02:26.981582  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 20:02:26.981672  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 20:02:27.019015  202120 cri.go:89] found id: ""
	I1202 20:02:27.019037  202120 logs.go:282] 0 containers: []
	W1202 20:02:27.019046  202120 logs.go:284] No container was found matching "kube-proxy"
	I1202 20:02:27.019053  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 20:02:27.019119  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 20:02:27.050255  202120 cri.go:89] found id: "86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992"
	I1202 20:02:27.050278  202120 cri.go:89] found id: ""
	I1202 20:02:27.050287  202120 logs.go:282] 1 containers: [86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992]
	I1202 20:02:27.050344  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:02:27.054466  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 20:02:27.054539  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 20:02:27.080732  202120 cri.go:89] found id: ""
	I1202 20:02:27.080755  202120 logs.go:282] 0 containers: []
	W1202 20:02:27.080764  202120 logs.go:284] No container was found matching "kindnet"
	I1202 20:02:27.080770  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1202 20:02:27.080831  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1202 20:02:27.112361  202120 cri.go:89] found id: ""
	I1202 20:02:27.112387  202120 logs.go:282] 0 containers: []
	W1202 20:02:27.112396  202120 logs.go:284] No container was found matching "storage-provisioner"
	I1202 20:02:27.112412  202120 logs.go:123] Gathering logs for kubelet ...
	I1202 20:02:27.112423  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 20:02:27.182314  202120 logs.go:123] Gathering logs for dmesg ...
	I1202 20:02:27.188445  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 20:02:27.230511  202120 logs.go:123] Gathering logs for kube-controller-manager [86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992] ...
	I1202 20:02:27.230539  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992"
	I1202 20:02:27.275868  202120 logs.go:123] Gathering logs for container status ...
	I1202 20:02:27.275904  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 20:02:27.314620  202120 logs.go:123] Gathering logs for describe nodes ...
	I1202 20:02:27.314649  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 20:02:27.394781  202120 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 20:02:27.394804  202120 logs.go:123] Gathering logs for kube-apiserver [18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92] ...
	I1202 20:02:27.394819  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92"
	I1202 20:02:27.440729  202120 logs.go:123] Gathering logs for etcd [f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53] ...
	I1202 20:02:27.440766  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53"
	I1202 20:02:27.504470  202120 logs.go:123] Gathering logs for kube-scheduler [80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3] ...
	I1202 20:02:27.504504  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3"
	I1202 20:02:27.556894  202120 logs.go:123] Gathering logs for containerd ...
	I1202 20:02:27.556932  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 20:02:30.092485  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:02:30.104081  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 20:02:30.104164  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 20:02:30.137151  202120 cri.go:89] found id: "18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92"
	I1202 20:02:30.137177  202120 cri.go:89] found id: ""
	I1202 20:02:30.137186  202120 logs.go:282] 1 containers: [18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92]
	I1202 20:02:30.137249  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:02:30.141452  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 20:02:30.141545  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 20:02:30.176358  202120 cri.go:89] found id: "f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53"
	I1202 20:02:30.176379  202120 cri.go:89] found id: ""
	I1202 20:02:30.176386  202120 logs.go:282] 1 containers: [f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53]
	I1202 20:02:30.176448  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:02:30.181317  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 20:02:30.181396  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 20:02:30.218146  202120 cri.go:89] found id: ""
	I1202 20:02:30.218225  202120 logs.go:282] 0 containers: []
	W1202 20:02:30.218249  202120 logs.go:284] No container was found matching "coredns"
	I1202 20:02:30.218269  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 20:02:30.218344  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 20:02:30.250250  202120 cri.go:89] found id: "80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3"
	I1202 20:02:30.250274  202120 cri.go:89] found id: ""
	I1202 20:02:30.250283  202120 logs.go:282] 1 containers: [80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3]
	I1202 20:02:30.250340  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:02:30.254244  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 20:02:30.254320  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 20:02:30.281096  202120 cri.go:89] found id: ""
	I1202 20:02:30.281122  202120 logs.go:282] 0 containers: []
	W1202 20:02:30.281131  202120 logs.go:284] No container was found matching "kube-proxy"
	I1202 20:02:30.281138  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 20:02:30.281203  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 20:02:30.306016  202120 cri.go:89] found id: "86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992"
	I1202 20:02:30.306041  202120 cri.go:89] found id: ""
	I1202 20:02:30.306051  202120 logs.go:282] 1 containers: [86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992]
	I1202 20:02:30.306109  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:02:30.309861  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 20:02:30.309936  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 20:02:30.338904  202120 cri.go:89] found id: ""
	I1202 20:02:30.338926  202120 logs.go:282] 0 containers: []
	W1202 20:02:30.338934  202120 logs.go:284] No container was found matching "kindnet"
	I1202 20:02:30.338941  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1202 20:02:30.339000  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1202 20:02:30.364744  202120 cri.go:89] found id: ""
	I1202 20:02:30.364767  202120 logs.go:282] 0 containers: []
	W1202 20:02:30.364779  202120 logs.go:284] No container was found matching "storage-provisioner"
	I1202 20:02:30.364793  202120 logs.go:123] Gathering logs for describe nodes ...
	I1202 20:02:30.364806  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 20:02:30.440425  202120 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 20:02:30.440451  202120 logs.go:123] Gathering logs for etcd [f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53] ...
	I1202 20:02:30.440465  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53"
	I1202 20:02:30.474898  202120 logs.go:123] Gathering logs for kube-scheduler [80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3] ...
	I1202 20:02:30.474933  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3"
	I1202 20:02:30.507405  202120 logs.go:123] Gathering logs for containerd ...
	I1202 20:02:30.507436  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 20:02:30.540905  202120 logs.go:123] Gathering logs for container status ...
	I1202 20:02:30.540939  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 20:02:30.569892  202120 logs.go:123] Gathering logs for kube-apiserver [18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92] ...
	I1202 20:02:30.569924  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92"
	I1202 20:02:30.617444  202120 logs.go:123] Gathering logs for kube-controller-manager [86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992] ...
	I1202 20:02:30.617474  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992"
	I1202 20:02:30.651870  202120 logs.go:123] Gathering logs for kubelet ...
	I1202 20:02:30.651903  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 20:02:30.713565  202120 logs.go:123] Gathering logs for dmesg ...
	I1202 20:02:30.713602  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 20:02:33.228434  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:02:33.240659  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 20:02:33.240732  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 20:02:33.277698  202120 cri.go:89] found id: "18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92"
	I1202 20:02:33.277718  202120 cri.go:89] found id: ""
	I1202 20:02:33.277726  202120 logs.go:282] 1 containers: [18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92]
	I1202 20:02:33.277789  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:02:33.286981  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 20:02:33.287056  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 20:02:33.318607  202120 cri.go:89] found id: "f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53"
	I1202 20:02:33.318626  202120 cri.go:89] found id: ""
	I1202 20:02:33.318634  202120 logs.go:282] 1 containers: [f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53]
	I1202 20:02:33.318691  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:02:33.322404  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 20:02:33.322476  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 20:02:33.354861  202120 cri.go:89] found id: ""
	I1202 20:02:33.354883  202120 logs.go:282] 0 containers: []
	W1202 20:02:33.354891  202120 logs.go:284] No container was found matching "coredns"
	I1202 20:02:33.354899  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 20:02:33.354971  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 20:02:33.388207  202120 cri.go:89] found id: "80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3"
	I1202 20:02:33.388227  202120 cri.go:89] found id: ""
	I1202 20:02:33.388235  202120 logs.go:282] 1 containers: [80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3]
	I1202 20:02:33.388307  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:02:33.392186  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 20:02:33.392254  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 20:02:33.419539  202120 cri.go:89] found id: ""
	I1202 20:02:33.419569  202120 logs.go:282] 0 containers: []
	W1202 20:02:33.419578  202120 logs.go:284] No container was found matching "kube-proxy"
	I1202 20:02:33.419584  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 20:02:33.419641  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 20:02:33.445053  202120 cri.go:89] found id: "86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992"
	I1202 20:02:33.445080  202120 cri.go:89] found id: ""
	I1202 20:02:33.445088  202120 logs.go:282] 1 containers: [86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992]
	I1202 20:02:33.445148  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:02:33.448941  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 20:02:33.449021  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 20:02:33.473829  202120 cri.go:89] found id: ""
	I1202 20:02:33.473911  202120 logs.go:282] 0 containers: []
	W1202 20:02:33.473934  202120 logs.go:284] No container was found matching "kindnet"
	I1202 20:02:33.473952  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1202 20:02:33.474025  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1202 20:02:33.499224  202120 cri.go:89] found id: ""
	I1202 20:02:33.499247  202120 logs.go:282] 0 containers: []
	W1202 20:02:33.499256  202120 logs.go:284] No container was found matching "storage-provisioner"
	I1202 20:02:33.499272  202120 logs.go:123] Gathering logs for etcd [f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53] ...
	I1202 20:02:33.499286  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53"
	I1202 20:02:33.531712  202120 logs.go:123] Gathering logs for kube-controller-manager [86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992] ...
	I1202 20:02:33.531746  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992"
	I1202 20:02:33.568393  202120 logs.go:123] Gathering logs for containerd ...
	I1202 20:02:33.568430  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 20:02:33.604696  202120 logs.go:123] Gathering logs for container status ...
	I1202 20:02:33.604731  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 20:02:33.635462  202120 logs.go:123] Gathering logs for kube-scheduler [80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3] ...
	I1202 20:02:33.635491  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3"
	I1202 20:02:33.673314  202120 logs.go:123] Gathering logs for kubelet ...
	I1202 20:02:33.673344  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 20:02:33.731431  202120 logs.go:123] Gathering logs for dmesg ...
	I1202 20:02:33.731466  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 20:02:33.744568  202120 logs.go:123] Gathering logs for describe nodes ...
	I1202 20:02:33.744595  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 20:02:33.812184  202120 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 20:02:33.812208  202120 logs.go:123] Gathering logs for kube-apiserver [18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92] ...
	I1202 20:02:33.812221  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92"
	I1202 20:02:36.349716  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:02:36.361115  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 20:02:36.361193  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 20:02:36.405349  202120 cri.go:89] found id: "18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92"
	I1202 20:02:36.405376  202120 cri.go:89] found id: ""
	I1202 20:02:36.405385  202120 logs.go:282] 1 containers: [18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92]
	I1202 20:02:36.405444  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:02:36.415759  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 20:02:36.415836  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 20:02:36.450715  202120 cri.go:89] found id: "f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53"
	I1202 20:02:36.450741  202120 cri.go:89] found id: ""
	I1202 20:02:36.450750  202120 logs.go:282] 1 containers: [f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53]
	I1202 20:02:36.450810  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:02:36.455530  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 20:02:36.455615  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 20:02:36.501452  202120 cri.go:89] found id: ""
	I1202 20:02:36.501475  202120 logs.go:282] 0 containers: []
	W1202 20:02:36.501484  202120 logs.go:284] No container was found matching "coredns"
	I1202 20:02:36.501490  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 20:02:36.501553  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 20:02:36.546611  202120 cri.go:89] found id: "80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3"
	I1202 20:02:36.546630  202120 cri.go:89] found id: ""
	I1202 20:02:36.546638  202120 logs.go:282] 1 containers: [80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3]
	I1202 20:02:36.546694  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:02:36.551050  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 20:02:36.551118  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 20:02:36.585778  202120 cri.go:89] found id: ""
	I1202 20:02:36.585799  202120 logs.go:282] 0 containers: []
	W1202 20:02:36.585807  202120 logs.go:284] No container was found matching "kube-proxy"
	I1202 20:02:36.585814  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 20:02:36.585872  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 20:02:36.617956  202120 cri.go:89] found id: "86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992"
	I1202 20:02:36.617974  202120 cri.go:89] found id: ""
	I1202 20:02:36.617982  202120 logs.go:282] 1 containers: [86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992]
	I1202 20:02:36.618046  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:02:36.624018  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 20:02:36.624101  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 20:02:36.653069  202120 cri.go:89] found id: ""
	I1202 20:02:36.653096  202120 logs.go:282] 0 containers: []
	W1202 20:02:36.653105  202120 logs.go:284] No container was found matching "kindnet"
	I1202 20:02:36.653112  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1202 20:02:36.653176  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1202 20:02:36.698429  202120 cri.go:89] found id: ""
	I1202 20:02:36.698454  202120 logs.go:282] 0 containers: []
	W1202 20:02:36.698464  202120 logs.go:284] No container was found matching "storage-provisioner"
	I1202 20:02:36.698478  202120 logs.go:123] Gathering logs for describe nodes ...
	I1202 20:02:36.698491  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 20:02:36.791795  202120 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 20:02:36.791871  202120 logs.go:123] Gathering logs for kube-apiserver [18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92] ...
	I1202 20:02:36.791898  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92"
	I1202 20:02:36.839230  202120 logs.go:123] Gathering logs for kube-scheduler [80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3] ...
	I1202 20:02:36.839319  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3"
	I1202 20:02:36.887749  202120 logs.go:123] Gathering logs for containerd ...
	I1202 20:02:36.887786  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 20:02:36.920401  202120 logs.go:123] Gathering logs for container status ...
	I1202 20:02:36.920436  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 20:02:36.951174  202120 logs.go:123] Gathering logs for etcd [f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53] ...
	I1202 20:02:36.951205  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53"
	I1202 20:02:37.001266  202120 logs.go:123] Gathering logs for kube-controller-manager [86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992] ...
	I1202 20:02:37.001300  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992"
	I1202 20:02:37.049908  202120 logs.go:123] Gathering logs for kubelet ...
	I1202 20:02:37.049942  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 20:02:37.111428  202120 logs.go:123] Gathering logs for dmesg ...
	I1202 20:02:37.111468  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 20:02:39.628430  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:02:39.639847  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 20:02:39.639923  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 20:02:39.677579  202120 cri.go:89] found id: "18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92"
	I1202 20:02:39.677611  202120 cri.go:89] found id: ""
	I1202 20:02:39.677620  202120 logs.go:282] 1 containers: [18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92]
	I1202 20:02:39.677677  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:02:39.684750  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 20:02:39.684829  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 20:02:39.720993  202120 cri.go:89] found id: "f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53"
	I1202 20:02:39.721019  202120 cri.go:89] found id: ""
	I1202 20:02:39.721026  202120 logs.go:282] 1 containers: [f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53]
	I1202 20:02:39.721085  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:02:39.725234  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 20:02:39.725351  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 20:02:39.763761  202120 cri.go:89] found id: ""
	I1202 20:02:39.763783  202120 logs.go:282] 0 containers: []
	W1202 20:02:39.763791  202120 logs.go:284] No container was found matching "coredns"
	I1202 20:02:39.763797  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 20:02:39.763858  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 20:02:39.790116  202120 cri.go:89] found id: "80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3"
	I1202 20:02:39.790136  202120 cri.go:89] found id: ""
	I1202 20:02:39.790144  202120 logs.go:282] 1 containers: [80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3]
	I1202 20:02:39.790201  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:02:39.794352  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 20:02:39.794459  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 20:02:39.839488  202120 cri.go:89] found id: ""
	I1202 20:02:39.839568  202120 logs.go:282] 0 containers: []
	W1202 20:02:39.839591  202120 logs.go:284] No container was found matching "kube-proxy"
	I1202 20:02:39.839610  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 20:02:39.839695  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 20:02:39.886534  202120 cri.go:89] found id: "86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992"
	I1202 20:02:39.886599  202120 cri.go:89] found id: ""
	I1202 20:02:39.886621  202120 logs.go:282] 1 containers: [86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992]
	I1202 20:02:39.886704  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:02:39.891027  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 20:02:39.891146  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 20:02:39.936852  202120 cri.go:89] found id: ""
	I1202 20:02:39.936932  202120 logs.go:282] 0 containers: []
	W1202 20:02:39.936957  202120 logs.go:284] No container was found matching "kindnet"
	I1202 20:02:39.936979  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1202 20:02:39.937061  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1202 20:02:40.001627  202120 cri.go:89] found id: ""
	I1202 20:02:40.001714  202120 logs.go:282] 0 containers: []
	W1202 20:02:40.001739  202120 logs.go:284] No container was found matching "storage-provisioner"
	I1202 20:02:40.001770  202120 logs.go:123] Gathering logs for container status ...
	I1202 20:02:40.001798  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 20:02:40.048034  202120 logs.go:123] Gathering logs for describe nodes ...
	I1202 20:02:40.048128  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 20:02:40.151612  202120 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 20:02:40.151634  202120 logs.go:123] Gathering logs for etcd [f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53] ...
	I1202 20:02:40.151650  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53"
	I1202 20:02:40.206513  202120 logs.go:123] Gathering logs for containerd ...
	I1202 20:02:40.206589  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 20:02:40.244581  202120 logs.go:123] Gathering logs for kubelet ...
	I1202 20:02:40.244614  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 20:02:40.319742  202120 logs.go:123] Gathering logs for dmesg ...
	I1202 20:02:40.319780  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 20:02:40.338207  202120 logs.go:123] Gathering logs for kube-apiserver [18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92] ...
	I1202 20:02:40.338237  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92"
	I1202 20:02:40.381786  202120 logs.go:123] Gathering logs for kube-scheduler [80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3] ...
	I1202 20:02:40.381823  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3"
	I1202 20:02:40.418593  202120 logs.go:123] Gathering logs for kube-controller-manager [86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992] ...
	I1202 20:02:40.418626  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992"
	I1202 20:02:42.960487  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:02:42.970573  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 20:02:42.970673  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 20:02:42.997422  202120 cri.go:89] found id: "18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92"
	I1202 20:02:42.997501  202120 cri.go:89] found id: ""
	I1202 20:02:42.997518  202120 logs.go:282] 1 containers: [18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92]
	I1202 20:02:42.997595  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:02:43.001229  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 20:02:43.001306  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 20:02:43.029364  202120 cri.go:89] found id: "f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53"
	I1202 20:02:43.029438  202120 cri.go:89] found id: ""
	I1202 20:02:43.029460  202120 logs.go:282] 1 containers: [f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53]
	I1202 20:02:43.029539  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:02:43.033325  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 20:02:43.033412  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 20:02:43.059500  202120 cri.go:89] found id: ""
	I1202 20:02:43.059531  202120 logs.go:282] 0 containers: []
	W1202 20:02:43.059540  202120 logs.go:284] No container was found matching "coredns"
	I1202 20:02:43.059547  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 20:02:43.059626  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 20:02:43.087062  202120 cri.go:89] found id: "80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3"
	I1202 20:02:43.087086  202120 cri.go:89] found id: ""
	I1202 20:02:43.087095  202120 logs.go:282] 1 containers: [80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3]
	I1202 20:02:43.087161  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:02:43.090845  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 20:02:43.090970  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 20:02:43.138000  202120 cri.go:89] found id: ""
	I1202 20:02:43.138033  202120 logs.go:282] 0 containers: []
	W1202 20:02:43.138071  202120 logs.go:284] No container was found matching "kube-proxy"
	I1202 20:02:43.138079  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 20:02:43.138159  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 20:02:43.187375  202120 cri.go:89] found id: "86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992"
	I1202 20:02:43.187452  202120 cri.go:89] found id: ""
	I1202 20:02:43.187475  202120 logs.go:282] 1 containers: [86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992]
	I1202 20:02:43.187549  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:02:43.191771  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 20:02:43.191880  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 20:02:43.266879  202120 cri.go:89] found id: ""
	I1202 20:02:43.266950  202120 logs.go:282] 0 containers: []
	W1202 20:02:43.266972  202120 logs.go:284] No container was found matching "kindnet"
	I1202 20:02:43.266992  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1202 20:02:43.267074  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1202 20:02:43.300245  202120 cri.go:89] found id: ""
	I1202 20:02:43.300311  202120 logs.go:282] 0 containers: []
	W1202 20:02:43.300389  202120 logs.go:284] No container was found matching "storage-provisioner"
	I1202 20:02:43.300422  202120 logs.go:123] Gathering logs for dmesg ...
	I1202 20:02:43.300449  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 20:02:43.314009  202120 logs.go:123] Gathering logs for kube-controller-manager [86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992] ...
	I1202 20:02:43.314042  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992"
	I1202 20:02:43.353270  202120 logs.go:123] Gathering logs for container status ...
	I1202 20:02:43.353343  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 20:02:43.394959  202120 logs.go:123] Gathering logs for kubelet ...
	I1202 20:02:43.394983  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 20:02:43.462409  202120 logs.go:123] Gathering logs for describe nodes ...
	I1202 20:02:43.462489  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 20:02:43.549316  202120 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 20:02:43.549336  202120 logs.go:123] Gathering logs for kube-apiserver [18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92] ...
	I1202 20:02:43.549348  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92"
	I1202 20:02:43.615325  202120 logs.go:123] Gathering logs for etcd [f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53] ...
	I1202 20:02:43.615359  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53"
	I1202 20:02:43.673284  202120 logs.go:123] Gathering logs for kube-scheduler [80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3] ...
	I1202 20:02:43.673324  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3"
	I1202 20:02:43.722781  202120 logs.go:123] Gathering logs for containerd ...
	I1202 20:02:43.722814  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 20:02:46.262164  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:02:46.272662  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 20:02:46.272733  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 20:02:46.299261  202120 cri.go:89] found id: "18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92"
	I1202 20:02:46.299291  202120 cri.go:89] found id: ""
	I1202 20:02:46.299301  202120 logs.go:282] 1 containers: [18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92]
	I1202 20:02:46.299362  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:02:46.303331  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 20:02:46.303409  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 20:02:46.331236  202120 cri.go:89] found id: "f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53"
	I1202 20:02:46.331260  202120 cri.go:89] found id: ""
	I1202 20:02:46.331269  202120 logs.go:282] 1 containers: [f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53]
	I1202 20:02:46.331325  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:02:46.335217  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 20:02:46.335299  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 20:02:46.360902  202120 cri.go:89] found id: ""
	I1202 20:02:46.360925  202120 logs.go:282] 0 containers: []
	W1202 20:02:46.360933  202120 logs.go:284] No container was found matching "coredns"
	I1202 20:02:46.360939  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 20:02:46.360999  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 20:02:46.387402  202120 cri.go:89] found id: "80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3"
	I1202 20:02:46.387423  202120 cri.go:89] found id: ""
	I1202 20:02:46.387431  202120 logs.go:282] 1 containers: [80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3]
	I1202 20:02:46.387490  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:02:46.391388  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 20:02:46.391457  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 20:02:46.423963  202120 cri.go:89] found id: ""
	I1202 20:02:46.423993  202120 logs.go:282] 0 containers: []
	W1202 20:02:46.424002  202120 logs.go:284] No container was found matching "kube-proxy"
	I1202 20:02:46.424009  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 20:02:46.424073  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 20:02:46.450219  202120 cri.go:89] found id: "86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992"
	I1202 20:02:46.450240  202120 cri.go:89] found id: ""
	I1202 20:02:46.450249  202120 logs.go:282] 1 containers: [86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992]
	I1202 20:02:46.450307  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:02:46.454139  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 20:02:46.454215  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 20:02:46.479918  202120 cri.go:89] found id: ""
	I1202 20:02:46.479941  202120 logs.go:282] 0 containers: []
	W1202 20:02:46.479950  202120 logs.go:284] No container was found matching "kindnet"
	I1202 20:02:46.479958  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1202 20:02:46.480017  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1202 20:02:46.505311  202120 cri.go:89] found id: ""
	I1202 20:02:46.505334  202120 logs.go:282] 0 containers: []
	W1202 20:02:46.505342  202120 logs.go:284] No container was found matching "storage-provisioner"
	I1202 20:02:46.505356  202120 logs.go:123] Gathering logs for dmesg ...
	I1202 20:02:46.505368  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 20:02:46.518259  202120 logs.go:123] Gathering logs for kube-apiserver [18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92] ...
	I1202 20:02:46.518287  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92"
	I1202 20:02:46.552042  202120 logs.go:123] Gathering logs for containerd ...
	I1202 20:02:46.552075  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 20:02:46.585269  202120 logs.go:123] Gathering logs for describe nodes ...
	I1202 20:02:46.585305  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 20:02:46.653888  202120 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 20:02:46.653962  202120 logs.go:123] Gathering logs for etcd [f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53] ...
	I1202 20:02:46.653988  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53"
	I1202 20:02:46.691468  202120 logs.go:123] Gathering logs for kube-scheduler [80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3] ...
	I1202 20:02:46.691496  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3"
	I1202 20:02:46.724239  202120 logs.go:123] Gathering logs for kube-controller-manager [86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992] ...
	I1202 20:02:46.724271  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992"
	I1202 20:02:46.759632  202120 logs.go:123] Gathering logs for container status ...
	I1202 20:02:46.760422  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 20:02:46.810829  202120 logs.go:123] Gathering logs for kubelet ...
	I1202 20:02:46.810911  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 20:02:49.387915  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:02:49.398162  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 20:02:49.398232  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 20:02:49.423883  202120 cri.go:89] found id: "18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92"
	I1202 20:02:49.423903  202120 cri.go:89] found id: ""
	I1202 20:02:49.423911  202120 logs.go:282] 1 containers: [18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92]
	I1202 20:02:49.423967  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:02:49.427769  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 20:02:49.427844  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 20:02:49.457285  202120 cri.go:89] found id: "f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53"
	I1202 20:02:49.457308  202120 cri.go:89] found id: ""
	I1202 20:02:49.457316  202120 logs.go:282] 1 containers: [f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53]
	I1202 20:02:49.457377  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:02:49.461167  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 20:02:49.461241  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 20:02:49.486799  202120 cri.go:89] found id: ""
	I1202 20:02:49.486822  202120 logs.go:282] 0 containers: []
	W1202 20:02:49.486838  202120 logs.go:284] No container was found matching "coredns"
	I1202 20:02:49.486845  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 20:02:49.486906  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 20:02:49.512427  202120 cri.go:89] found id: "80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3"
	I1202 20:02:49.512447  202120 cri.go:89] found id: ""
	I1202 20:02:49.512455  202120 logs.go:282] 1 containers: [80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3]
	I1202 20:02:49.512515  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:02:49.516523  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 20:02:49.516653  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 20:02:49.545011  202120 cri.go:89] found id: ""
	I1202 20:02:49.545034  202120 logs.go:282] 0 containers: []
	W1202 20:02:49.545043  202120 logs.go:284] No container was found matching "kube-proxy"
	I1202 20:02:49.545050  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 20:02:49.545109  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 20:02:49.569593  202120 cri.go:89] found id: "86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992"
	I1202 20:02:49.569617  202120 cri.go:89] found id: ""
	I1202 20:02:49.569626  202120 logs.go:282] 1 containers: [86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992]
	I1202 20:02:49.569683  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:02:49.573751  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 20:02:49.573851  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 20:02:49.599141  202120 cri.go:89] found id: ""
	I1202 20:02:49.599162  202120 logs.go:282] 0 containers: []
	W1202 20:02:49.599171  202120 logs.go:284] No container was found matching "kindnet"
	I1202 20:02:49.599178  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1202 20:02:49.599237  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1202 20:02:49.624214  202120 cri.go:89] found id: ""
	I1202 20:02:49.624236  202120 logs.go:282] 0 containers: []
	W1202 20:02:49.624244  202120 logs.go:284] No container was found matching "storage-provisioner"
	I1202 20:02:49.624258  202120 logs.go:123] Gathering logs for kubelet ...
	I1202 20:02:49.624269  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 20:02:49.681870  202120 logs.go:123] Gathering logs for describe nodes ...
	I1202 20:02:49.681906  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 20:02:49.749286  202120 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 20:02:49.749344  202120 logs.go:123] Gathering logs for etcd [f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53] ...
	I1202 20:02:49.749370  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53"
	I1202 20:02:49.786515  202120 logs.go:123] Gathering logs for kube-scheduler [80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3] ...
	I1202 20:02:49.786546  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3"
	I1202 20:02:49.819203  202120 logs.go:123] Gathering logs for containerd ...
	I1202 20:02:49.819236  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 20:02:49.852222  202120 logs.go:123] Gathering logs for dmesg ...
	I1202 20:02:49.852256  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 20:02:49.865502  202120 logs.go:123] Gathering logs for kube-apiserver [18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92] ...
	I1202 20:02:49.865535  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92"
	I1202 20:02:49.910749  202120 logs.go:123] Gathering logs for kube-controller-manager [86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992] ...
	I1202 20:02:49.910779  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992"
	I1202 20:02:49.949155  202120 logs.go:123] Gathering logs for container status ...
	I1202 20:02:49.949233  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 20:02:52.488300  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:02:52.498614  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 20:02:52.498686  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 20:02:52.524278  202120 cri.go:89] found id: "18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92"
	I1202 20:02:52.524302  202120 cri.go:89] found id: ""
	I1202 20:02:52.524311  202120 logs.go:282] 1 containers: [18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92]
	I1202 20:02:52.524397  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:02:52.528132  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 20:02:52.528221  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 20:02:52.553733  202120 cri.go:89] found id: "f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53"
	I1202 20:02:52.553755  202120 cri.go:89] found id: ""
	I1202 20:02:52.553763  202120 logs.go:282] 1 containers: [f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53]
	I1202 20:02:52.553821  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:02:52.557519  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 20:02:52.557597  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 20:02:52.590094  202120 cri.go:89] found id: ""
	I1202 20:02:52.590117  202120 logs.go:282] 0 containers: []
	W1202 20:02:52.590127  202120 logs.go:284] No container was found matching "coredns"
	I1202 20:02:52.590134  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 20:02:52.590196  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 20:02:52.621371  202120 cri.go:89] found id: "80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3"
	I1202 20:02:52.621401  202120 cri.go:89] found id: ""
	I1202 20:02:52.621410  202120 logs.go:282] 1 containers: [80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3]
	I1202 20:02:52.621472  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:02:52.625326  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 20:02:52.625400  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 20:02:52.651414  202120 cri.go:89] found id: ""
	I1202 20:02:52.651440  202120 logs.go:282] 0 containers: []
	W1202 20:02:52.651450  202120 logs.go:284] No container was found matching "kube-proxy"
	I1202 20:02:52.651457  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 20:02:52.651542  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 20:02:52.678241  202120 cri.go:89] found id: "86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992"
	I1202 20:02:52.678271  202120 cri.go:89] found id: ""
	I1202 20:02:52.678280  202120 logs.go:282] 1 containers: [86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992]
	I1202 20:02:52.678342  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:02:52.682190  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 20:02:52.682263  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 20:02:52.708085  202120 cri.go:89] found id: ""
	I1202 20:02:52.708110  202120 logs.go:282] 0 containers: []
	W1202 20:02:52.708125  202120 logs.go:284] No container was found matching "kindnet"
	I1202 20:02:52.708132  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1202 20:02:52.708202  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1202 20:02:52.734716  202120 cri.go:89] found id: ""
	I1202 20:02:52.734743  202120 logs.go:282] 0 containers: []
	W1202 20:02:52.734753  202120 logs.go:284] No container was found matching "storage-provisioner"
	I1202 20:02:52.734767  202120 logs.go:123] Gathering logs for kube-scheduler [80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3] ...
	I1202 20:02:52.734781  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3"
	I1202 20:02:52.769377  202120 logs.go:123] Gathering logs for kube-controller-manager [86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992] ...
	I1202 20:02:52.769416  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992"
	I1202 20:02:52.802941  202120 logs.go:123] Gathering logs for containerd ...
	I1202 20:02:52.802976  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 20:02:52.836569  202120 logs.go:123] Gathering logs for container status ...
	I1202 20:02:52.836601  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 20:02:52.878770  202120 logs.go:123] Gathering logs for kubelet ...
	I1202 20:02:52.878798  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 20:02:52.941761  202120 logs.go:123] Gathering logs for dmesg ...
	I1202 20:02:52.941801  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 20:02:52.959731  202120 logs.go:123] Gathering logs for describe nodes ...
	I1202 20:02:52.959756  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 20:02:53.030361  202120 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 20:02:53.030386  202120 logs.go:123] Gathering logs for kube-apiserver [18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92] ...
	I1202 20:02:53.030410  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92"
	I1202 20:02:53.065582  202120 logs.go:123] Gathering logs for etcd [f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53] ...
	I1202 20:02:53.065616  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53"
	I1202 20:02:55.600628  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:02:55.611059  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 20:02:55.611130  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 20:02:55.639945  202120 cri.go:89] found id: "18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92"
	I1202 20:02:55.639966  202120 cri.go:89] found id: ""
	I1202 20:02:55.639974  202120 logs.go:282] 1 containers: [18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92]
	I1202 20:02:55.640031  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:02:55.643800  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 20:02:55.643877  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 20:02:55.670454  202120 cri.go:89] found id: "f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53"
	I1202 20:02:55.670476  202120 cri.go:89] found id: ""
	I1202 20:02:55.670484  202120 logs.go:282] 1 containers: [f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53]
	I1202 20:02:55.670542  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:02:55.674507  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 20:02:55.674577  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 20:02:55.700285  202120 cri.go:89] found id: ""
	I1202 20:02:55.700310  202120 logs.go:282] 0 containers: []
	W1202 20:02:55.700342  202120 logs.go:284] No container was found matching "coredns"
	I1202 20:02:55.700350  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 20:02:55.700411  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 20:02:55.725853  202120 cri.go:89] found id: "80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3"
	I1202 20:02:55.725881  202120 cri.go:89] found id: ""
	I1202 20:02:55.725891  202120 logs.go:282] 1 containers: [80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3]
	I1202 20:02:55.725949  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:02:55.729839  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 20:02:55.729913  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 20:02:55.754919  202120 cri.go:89] found id: ""
	I1202 20:02:55.754943  202120 logs.go:282] 0 containers: []
	W1202 20:02:55.754951  202120 logs.go:284] No container was found matching "kube-proxy"
	I1202 20:02:55.754958  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 20:02:55.755018  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 20:02:55.787136  202120 cri.go:89] found id: "86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992"
	I1202 20:02:55.787159  202120 cri.go:89] found id: ""
	I1202 20:02:55.787167  202120 logs.go:282] 1 containers: [86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992]
	I1202 20:02:55.787225  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:02:55.790868  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 20:02:55.790944  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 20:02:55.816487  202120 cri.go:89] found id: ""
	I1202 20:02:55.816513  202120 logs.go:282] 0 containers: []
	W1202 20:02:55.816522  202120 logs.go:284] No container was found matching "kindnet"
	I1202 20:02:55.816528  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1202 20:02:55.816590  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1202 20:02:55.842662  202120 cri.go:89] found id: ""
	I1202 20:02:55.842686  202120 logs.go:282] 0 containers: []
	W1202 20:02:55.842694  202120 logs.go:284] No container was found matching "storage-provisioner"
	I1202 20:02:55.842709  202120 logs.go:123] Gathering logs for kubelet ...
	I1202 20:02:55.842725  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 20:02:55.900984  202120 logs.go:123] Gathering logs for dmesg ...
	I1202 20:02:55.901019  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 20:02:55.914592  202120 logs.go:123] Gathering logs for describe nodes ...
	I1202 20:02:55.914620  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 20:02:56.002330  202120 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 20:02:56.002352  202120 logs.go:123] Gathering logs for kube-apiserver [18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92] ...
	I1202 20:02:56.002364  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92"
	I1202 20:02:56.037916  202120 logs.go:123] Gathering logs for etcd [f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53] ...
	I1202 20:02:56.037951  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53"
	I1202 20:02:56.070845  202120 logs.go:123] Gathering logs for container status ...
	I1202 20:02:56.070878  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 20:02:56.112286  202120 logs.go:123] Gathering logs for kube-scheduler [80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3] ...
	I1202 20:02:56.112313  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3"
	I1202 20:02:56.146546  202120 logs.go:123] Gathering logs for kube-controller-manager [86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992] ...
	I1202 20:02:56.146576  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992"
	I1202 20:02:56.201947  202120 logs.go:123] Gathering logs for containerd ...
	I1202 20:02:56.201979  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 20:02:58.734432  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:02:58.744906  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 20:02:58.744977  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 20:02:58.776055  202120 cri.go:89] found id: "18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92"
	I1202 20:02:58.776076  202120 cri.go:89] found id: ""
	I1202 20:02:58.776084  202120 logs.go:282] 1 containers: [18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92]
	I1202 20:02:58.776164  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:02:58.779923  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 20:02:58.779997  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 20:02:58.805623  202120 cri.go:89] found id: "f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53"
	I1202 20:02:58.805646  202120 cri.go:89] found id: ""
	I1202 20:02:58.805655  202120 logs.go:282] 1 containers: [f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53]
	I1202 20:02:58.805711  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:02:58.809637  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 20:02:58.809709  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 20:02:58.833948  202120 cri.go:89] found id: ""
	I1202 20:02:58.833970  202120 logs.go:282] 0 containers: []
	W1202 20:02:58.833978  202120 logs.go:284] No container was found matching "coredns"
	I1202 20:02:58.833985  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 20:02:58.834087  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 20:02:58.867721  202120 cri.go:89] found id: "80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3"
	I1202 20:02:58.867749  202120 cri.go:89] found id: ""
	I1202 20:02:58.867757  202120 logs.go:282] 1 containers: [80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3]
	I1202 20:02:58.867820  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:02:58.871666  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 20:02:58.871736  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 20:02:58.898303  202120 cri.go:89] found id: ""
	I1202 20:02:58.898326  202120 logs.go:282] 0 containers: []
	W1202 20:02:58.898335  202120 logs.go:284] No container was found matching "kube-proxy"
	I1202 20:02:58.898341  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 20:02:58.898400  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 20:02:58.928935  202120 cri.go:89] found id: "86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992"
	I1202 20:02:58.928957  202120 cri.go:89] found id: ""
	I1202 20:02:58.928964  202120 logs.go:282] 1 containers: [86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992]
	I1202 20:02:58.929027  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:02:58.933633  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 20:02:58.933709  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 20:02:58.960266  202120 cri.go:89] found id: ""
	I1202 20:02:58.960292  202120 logs.go:282] 0 containers: []
	W1202 20:02:58.960300  202120 logs.go:284] No container was found matching "kindnet"
	I1202 20:02:58.960307  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1202 20:02:58.960396  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1202 20:02:58.995326  202120 cri.go:89] found id: ""
	I1202 20:02:58.995349  202120 logs.go:282] 0 containers: []
	W1202 20:02:58.995359  202120 logs.go:284] No container was found matching "storage-provisioner"
	I1202 20:02:58.995374  202120 logs.go:123] Gathering logs for describe nodes ...
	I1202 20:02:58.995386  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 20:02:59.075206  202120 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 20:02:59.075242  202120 logs.go:123] Gathering logs for kube-scheduler [80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3] ...
	I1202 20:02:59.075257  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3"
	I1202 20:02:59.107344  202120 logs.go:123] Gathering logs for kube-controller-manager [86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992] ...
	I1202 20:02:59.107373  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992"
	I1202 20:02:59.148161  202120 logs.go:123] Gathering logs for containerd ...
	I1202 20:02:59.148196  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 20:02:59.180952  202120 logs.go:123] Gathering logs for container status ...
	I1202 20:02:59.180985  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 20:02:59.209509  202120 logs.go:123] Gathering logs for kubelet ...
	I1202 20:02:59.209538  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 20:02:59.268384  202120 logs.go:123] Gathering logs for dmesg ...
	I1202 20:02:59.268419  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 20:02:59.282062  202120 logs.go:123] Gathering logs for kube-apiserver [18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92] ...
	I1202 20:02:59.282090  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92"
	I1202 20:02:59.315985  202120 logs.go:123] Gathering logs for etcd [f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53] ...
	I1202 20:02:59.316016  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53"
	I1202 20:03:01.849814  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:03:01.860841  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 20:03:01.860957  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 20:03:01.888718  202120 cri.go:89] found id: "18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92"
	I1202 20:03:01.888737  202120 cri.go:89] found id: ""
	I1202 20:03:01.888745  202120 logs.go:282] 1 containers: [18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92]
	I1202 20:03:01.888805  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:03:01.892603  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 20:03:01.892674  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 20:03:01.932092  202120 cri.go:89] found id: "f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53"
	I1202 20:03:01.932123  202120 cri.go:89] found id: ""
	I1202 20:03:01.932132  202120 logs.go:282] 1 containers: [f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53]
	I1202 20:03:01.932191  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:03:01.936704  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 20:03:01.936780  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 20:03:01.963598  202120 cri.go:89] found id: ""
	I1202 20:03:01.963622  202120 logs.go:282] 0 containers: []
	W1202 20:03:01.963631  202120 logs.go:284] No container was found matching "coredns"
	I1202 20:03:01.963637  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 20:03:01.963694  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 20:03:01.994252  202120 cri.go:89] found id: "80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3"
	I1202 20:03:01.994318  202120 cri.go:89] found id: ""
	I1202 20:03:01.994338  202120 logs.go:282] 1 containers: [80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3]
	I1202 20:03:01.994422  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:03:01.998426  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 20:03:01.998512  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 20:03:02.027127  202120 cri.go:89] found id: ""
	I1202 20:03:02.027153  202120 logs.go:282] 0 containers: []
	W1202 20:03:02.027162  202120 logs.go:284] No container was found matching "kube-proxy"
	I1202 20:03:02.027169  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 20:03:02.027237  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 20:03:02.055228  202120 cri.go:89] found id: "86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992"
	I1202 20:03:02.055252  202120 cri.go:89] found id: ""
	I1202 20:03:02.055261  202120 logs.go:282] 1 containers: [86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992]
	I1202 20:03:02.055319  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:03:02.059324  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 20:03:02.059437  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 20:03:02.084986  202120 cri.go:89] found id: ""
	I1202 20:03:02.085012  202120 logs.go:282] 0 containers: []
	W1202 20:03:02.085021  202120 logs.go:284] No container was found matching "kindnet"
	I1202 20:03:02.085027  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1202 20:03:02.085091  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1202 20:03:02.110007  202120 cri.go:89] found id: ""
	I1202 20:03:02.110072  202120 logs.go:282] 0 containers: []
	W1202 20:03:02.110094  202120 logs.go:284] No container was found matching "storage-provisioner"
	I1202 20:03:02.110116  202120 logs.go:123] Gathering logs for dmesg ...
	I1202 20:03:02.110128  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 20:03:02.123193  202120 logs.go:123] Gathering logs for kube-scheduler [80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3] ...
	I1202 20:03:02.123223  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3"
	I1202 20:03:02.155542  202120 logs.go:123] Gathering logs for kube-controller-manager [86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992] ...
	I1202 20:03:02.155577  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992"
	I1202 20:03:02.191469  202120 logs.go:123] Gathering logs for container status ...
	I1202 20:03:02.191504  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 20:03:02.221362  202120 logs.go:123] Gathering logs for describe nodes ...
	I1202 20:03:02.221389  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 20:03:02.285360  202120 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 20:03:02.285430  202120 logs.go:123] Gathering logs for kube-apiserver [18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92] ...
	I1202 20:03:02.285450  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92"
	I1202 20:03:02.319013  202120 logs.go:123] Gathering logs for etcd [f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53] ...
	I1202 20:03:02.319045  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53"
	I1202 20:03:02.353725  202120 logs.go:123] Gathering logs for containerd ...
	I1202 20:03:02.353757  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 20:03:02.386428  202120 logs.go:123] Gathering logs for kubelet ...
	I1202 20:03:02.386465  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 20:03:04.948439  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:03:04.978404  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 20:03:04.978479  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 20:03:05.022157  202120 cri.go:89] found id: "18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92"
	I1202 20:03:05.022177  202120 cri.go:89] found id: ""
	I1202 20:03:05.022186  202120 logs.go:282] 1 containers: [18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92]
	I1202 20:03:05.022249  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:03:05.029646  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 20:03:05.029782  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 20:03:05.079614  202120 cri.go:89] found id: "f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53"
	I1202 20:03:05.079674  202120 cri.go:89] found id: ""
	I1202 20:03:05.079706  202120 logs.go:282] 1 containers: [f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53]
	I1202 20:03:05.079794  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:03:05.085764  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 20:03:05.085888  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 20:03:05.122610  202120 cri.go:89] found id: ""
	I1202 20:03:05.122687  202120 logs.go:282] 0 containers: []
	W1202 20:03:05.122711  202120 logs.go:284] No container was found matching "coredns"
	I1202 20:03:05.122730  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 20:03:05.122823  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 20:03:05.160880  202120 cri.go:89] found id: "80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3"
	I1202 20:03:05.160956  202120 cri.go:89] found id: ""
	I1202 20:03:05.160988  202120 logs.go:282] 1 containers: [80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3]
	I1202 20:03:05.161079  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:03:05.165522  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 20:03:05.165644  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 20:03:05.199375  202120 cri.go:89] found id: ""
	I1202 20:03:05.199451  202120 logs.go:282] 0 containers: []
	W1202 20:03:05.199486  202120 logs.go:284] No container was found matching "kube-proxy"
	I1202 20:03:05.199511  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 20:03:05.199607  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 20:03:05.232306  202120 cri.go:89] found id: "86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992"
	I1202 20:03:05.232406  202120 cri.go:89] found id: ""
	I1202 20:03:05.232428  202120 logs.go:282] 1 containers: [86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992]
	I1202 20:03:05.232521  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:03:05.236983  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 20:03:05.237107  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 20:03:05.278737  202120 cri.go:89] found id: ""
	I1202 20:03:05.278814  202120 logs.go:282] 0 containers: []
	W1202 20:03:05.278838  202120 logs.go:284] No container was found matching "kindnet"
	I1202 20:03:05.278856  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1202 20:03:05.278968  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1202 20:03:05.312885  202120 cri.go:89] found id: ""
	I1202 20:03:05.312968  202120 logs.go:282] 0 containers: []
	W1202 20:03:05.312990  202120 logs.go:284] No container was found matching "storage-provisioner"
	I1202 20:03:05.313017  202120 logs.go:123] Gathering logs for kubelet ...
	I1202 20:03:05.313062  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 20:03:05.380008  202120 logs.go:123] Gathering logs for dmesg ...
	I1202 20:03:05.380085  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 20:03:05.393338  202120 logs.go:123] Gathering logs for describe nodes ...
	I1202 20:03:05.393362  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 20:03:05.479791  202120 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 20:03:05.479808  202120 logs.go:123] Gathering logs for etcd [f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53] ...
	I1202 20:03:05.479821  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53"
	I1202 20:03:05.522595  202120 logs.go:123] Gathering logs for kube-controller-manager [86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992] ...
	I1202 20:03:05.522674  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992"
	I1202 20:03:05.579321  202120 logs.go:123] Gathering logs for kube-apiserver [18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92] ...
	I1202 20:03:05.579396  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92"
	I1202 20:03:05.639354  202120 logs.go:123] Gathering logs for kube-scheduler [80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3] ...
	I1202 20:03:05.639430  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3"
	I1202 20:03:05.694719  202120 logs.go:123] Gathering logs for containerd ...
	I1202 20:03:05.694796  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 20:03:05.768454  202120 logs.go:123] Gathering logs for container status ...
	I1202 20:03:05.772397  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 20:03:08.345426  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:03:08.356101  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 20:03:08.356172  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 20:03:08.382588  202120 cri.go:89] found id: "18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92"
	I1202 20:03:08.382611  202120 cri.go:89] found id: ""
	I1202 20:03:08.382620  202120 logs.go:282] 1 containers: [18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92]
	I1202 20:03:08.382684  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:03:08.386502  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 20:03:08.386579  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 20:03:08.414983  202120 cri.go:89] found id: "f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53"
	I1202 20:03:08.415008  202120 cri.go:89] found id: ""
	I1202 20:03:08.415018  202120 logs.go:282] 1 containers: [f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53]
	I1202 20:03:08.415094  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:03:08.418976  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 20:03:08.419054  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 20:03:08.444661  202120 cri.go:89] found id: ""
	I1202 20:03:08.444687  202120 logs.go:282] 0 containers: []
	W1202 20:03:08.444696  202120 logs.go:284] No container was found matching "coredns"
	I1202 20:03:08.444703  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 20:03:08.444766  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 20:03:08.470225  202120 cri.go:89] found id: "80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3"
	I1202 20:03:08.470247  202120 cri.go:89] found id: ""
	I1202 20:03:08.470255  202120 logs.go:282] 1 containers: [80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3]
	I1202 20:03:08.470335  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:03:08.474259  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 20:03:08.474360  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 20:03:08.500615  202120 cri.go:89] found id: ""
	I1202 20:03:08.500680  202120 logs.go:282] 0 containers: []
	W1202 20:03:08.500695  202120 logs.go:284] No container was found matching "kube-proxy"
	I1202 20:03:08.500702  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 20:03:08.500771  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 20:03:08.526180  202120 cri.go:89] found id: "86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992"
	I1202 20:03:08.526202  202120 cri.go:89] found id: ""
	I1202 20:03:08.526210  202120 logs.go:282] 1 containers: [86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992]
	I1202 20:03:08.526269  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:03:08.530036  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 20:03:08.530110  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 20:03:08.559052  202120 cri.go:89] found id: ""
	I1202 20:03:08.559077  202120 logs.go:282] 0 containers: []
	W1202 20:03:08.559086  202120 logs.go:284] No container was found matching "kindnet"
	I1202 20:03:08.559093  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1202 20:03:08.559151  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1202 20:03:08.585464  202120 cri.go:89] found id: ""
	I1202 20:03:08.585529  202120 logs.go:282] 0 containers: []
	W1202 20:03:08.585543  202120 logs.go:284] No container was found matching "storage-provisioner"
	I1202 20:03:08.585561  202120 logs.go:123] Gathering logs for kubelet ...
	I1202 20:03:08.585573  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 20:03:08.643310  202120 logs.go:123] Gathering logs for describe nodes ...
	I1202 20:03:08.643388  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 20:03:08.732527  202120 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 20:03:08.732546  202120 logs.go:123] Gathering logs for kube-apiserver [18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92] ...
	I1202 20:03:08.732559  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92"
	I1202 20:03:08.780204  202120 logs.go:123] Gathering logs for etcd [f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53] ...
	I1202 20:03:08.780238  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53"
	I1202 20:03:08.813529  202120 logs.go:123] Gathering logs for kube-scheduler [80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3] ...
	I1202 20:03:08.813562  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3"
	I1202 20:03:08.848537  202120 logs.go:123] Gathering logs for container status ...
	I1202 20:03:08.848569  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 20:03:08.887205  202120 logs.go:123] Gathering logs for dmesg ...
	I1202 20:03:08.887235  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 20:03:08.900287  202120 logs.go:123] Gathering logs for kube-controller-manager [86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992] ...
	I1202 20:03:08.900316  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992"
	I1202 20:03:08.952538  202120 logs.go:123] Gathering logs for containerd ...
	I1202 20:03:08.952572  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 20:03:11.498798  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:03:11.509566  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 20:03:11.509640  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 20:03:11.535513  202120 cri.go:89] found id: "18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92"
	I1202 20:03:11.535533  202120 cri.go:89] found id: ""
	I1202 20:03:11.535543  202120 logs.go:282] 1 containers: [18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92]
	I1202 20:03:11.535602  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:03:11.539644  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 20:03:11.539724  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 20:03:11.569085  202120 cri.go:89] found id: "f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53"
	I1202 20:03:11.569106  202120 cri.go:89] found id: ""
	I1202 20:03:11.569114  202120 logs.go:282] 1 containers: [f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53]
	I1202 20:03:11.569174  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:03:11.572992  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 20:03:11.573068  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 20:03:11.604113  202120 cri.go:89] found id: ""
	I1202 20:03:11.604162  202120 logs.go:282] 0 containers: []
	W1202 20:03:11.604176  202120 logs.go:284] No container was found matching "coredns"
	I1202 20:03:11.604184  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 20:03:11.604256  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 20:03:11.635341  202120 cri.go:89] found id: "80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3"
	I1202 20:03:11.635364  202120 cri.go:89] found id: ""
	I1202 20:03:11.635373  202120 logs.go:282] 1 containers: [80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3]
	I1202 20:03:11.635431  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:03:11.639266  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 20:03:11.639342  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 20:03:11.665704  202120 cri.go:89] found id: ""
	I1202 20:03:11.665730  202120 logs.go:282] 0 containers: []
	W1202 20:03:11.665754  202120 logs.go:284] No container was found matching "kube-proxy"
	I1202 20:03:11.665777  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 20:03:11.665852  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 20:03:11.704014  202120 cri.go:89] found id: "86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992"
	I1202 20:03:11.704035  202120 cri.go:89] found id: ""
	I1202 20:03:11.704049  202120 logs.go:282] 1 containers: [86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992]
	I1202 20:03:11.704125  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:03:11.711850  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 20:03:11.711943  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 20:03:11.759116  202120 cri.go:89] found id: ""
	I1202 20:03:11.759142  202120 logs.go:282] 0 containers: []
	W1202 20:03:11.759151  202120 logs.go:284] No container was found matching "kindnet"
	I1202 20:03:11.759158  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1202 20:03:11.759221  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1202 20:03:11.795635  202120 cri.go:89] found id: ""
	I1202 20:03:11.795659  202120 logs.go:282] 0 containers: []
	W1202 20:03:11.795667  202120 logs.go:284] No container was found matching "storage-provisioner"
	I1202 20:03:11.795680  202120 logs.go:123] Gathering logs for kubelet ...
	I1202 20:03:11.795692  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 20:03:11.853880  202120 logs.go:123] Gathering logs for container status ...
	I1202 20:03:11.853914  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 20:03:11.884837  202120 logs.go:123] Gathering logs for dmesg ...
	I1202 20:03:11.884868  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 20:03:11.897655  202120 logs.go:123] Gathering logs for describe nodes ...
	I1202 20:03:11.897683  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 20:03:11.963162  202120 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 20:03:11.963183  202120 logs.go:123] Gathering logs for kube-apiserver [18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92] ...
	I1202 20:03:11.963196  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92"
	I1202 20:03:11.998168  202120 logs.go:123] Gathering logs for etcd [f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53] ...
	I1202 20:03:11.998205  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53"
	I1202 20:03:12.031987  202120 logs.go:123] Gathering logs for kube-scheduler [80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3] ...
	I1202 20:03:12.032021  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3"
	I1202 20:03:12.069504  202120 logs.go:123] Gathering logs for kube-controller-manager [86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992] ...
	I1202 20:03:12.069534  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992"
	I1202 20:03:12.103980  202120 logs.go:123] Gathering logs for containerd ...
	I1202 20:03:12.104010  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 20:03:14.637515  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:03:14.648161  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 20:03:14.648238  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 20:03:14.684411  202120 cri.go:89] found id: "18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92"
	I1202 20:03:14.684441  202120 cri.go:89] found id: ""
	I1202 20:03:14.684450  202120 logs.go:282] 1 containers: [18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92]
	I1202 20:03:14.684513  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:03:14.688925  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 20:03:14.689000  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 20:03:14.716481  202120 cri.go:89] found id: "f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53"
	I1202 20:03:14.716504  202120 cri.go:89] found id: ""
	I1202 20:03:14.716513  202120 logs.go:282] 1 containers: [f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53]
	I1202 20:03:14.716571  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:03:14.720856  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 20:03:14.720928  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 20:03:14.750641  202120 cri.go:89] found id: ""
	I1202 20:03:14.750666  202120 logs.go:282] 0 containers: []
	W1202 20:03:14.750675  202120 logs.go:284] No container was found matching "coredns"
	I1202 20:03:14.750681  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 20:03:14.750742  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 20:03:14.781852  202120 cri.go:89] found id: "80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3"
	I1202 20:03:14.781875  202120 cri.go:89] found id: ""
	I1202 20:03:14.781883  202120 logs.go:282] 1 containers: [80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3]
	I1202 20:03:14.781964  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:03:14.785792  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 20:03:14.785893  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 20:03:14.819750  202120 cri.go:89] found id: ""
	I1202 20:03:14.819783  202120 logs.go:282] 0 containers: []
	W1202 20:03:14.819792  202120 logs.go:284] No container was found matching "kube-proxy"
	I1202 20:03:14.819799  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 20:03:14.819868  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 20:03:14.845551  202120 cri.go:89] found id: "86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992"
	I1202 20:03:14.845573  202120 cri.go:89] found id: ""
	I1202 20:03:14.845581  202120 logs.go:282] 1 containers: [86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992]
	I1202 20:03:14.845660  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:03:14.849489  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 20:03:14.849573  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 20:03:14.877292  202120 cri.go:89] found id: ""
	I1202 20:03:14.877366  202120 logs.go:282] 0 containers: []
	W1202 20:03:14.877389  202120 logs.go:284] No container was found matching "kindnet"
	I1202 20:03:14.877407  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1202 20:03:14.877501  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1202 20:03:14.903747  202120 cri.go:89] found id: ""
	I1202 20:03:14.903818  202120 logs.go:282] 0 containers: []
	W1202 20:03:14.903841  202120 logs.go:284] No container was found matching "storage-provisioner"
	I1202 20:03:14.903869  202120 logs.go:123] Gathering logs for kubelet ...
	I1202 20:03:14.903907  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 20:03:14.962895  202120 logs.go:123] Gathering logs for etcd [f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53] ...
	I1202 20:03:14.962931  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53"
	I1202 20:03:14.994846  202120 logs.go:123] Gathering logs for container status ...
	I1202 20:03:14.994879  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 20:03:15.082829  202120 logs.go:123] Gathering logs for dmesg ...
	I1202 20:03:15.082859  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 20:03:15.096496  202120 logs.go:123] Gathering logs for describe nodes ...
	I1202 20:03:15.096525  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 20:03:15.166983  202120 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 20:03:15.167005  202120 logs.go:123] Gathering logs for kube-apiserver [18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92] ...
	I1202 20:03:15.167018  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92"
	I1202 20:03:15.201372  202120 logs.go:123] Gathering logs for kube-scheduler [80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3] ...
	I1202 20:03:15.201403  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3"
	I1202 20:03:15.241265  202120 logs.go:123] Gathering logs for kube-controller-manager [86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992] ...
	I1202 20:03:15.241299  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992"
	I1202 20:03:15.276322  202120 logs.go:123] Gathering logs for containerd ...
	I1202 20:03:15.276352  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 20:03:17.809245  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:03:17.819408  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 20:03:17.819476  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 20:03:17.843859  202120 cri.go:89] found id: "18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92"
	I1202 20:03:17.843882  202120 cri.go:89] found id: ""
	I1202 20:03:17.843889  202120 logs.go:282] 1 containers: [18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92]
	I1202 20:03:17.843945  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:03:17.847754  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 20:03:17.847850  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 20:03:17.872914  202120 cri.go:89] found id: "f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53"
	I1202 20:03:17.872938  202120 cri.go:89] found id: ""
	I1202 20:03:17.872946  202120 logs.go:282] 1 containers: [f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53]
	I1202 20:03:17.873006  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:03:17.876972  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 20:03:17.877067  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 20:03:17.901645  202120 cri.go:89] found id: ""
	I1202 20:03:17.901669  202120 logs.go:282] 0 containers: []
	W1202 20:03:17.901678  202120 logs.go:284] No container was found matching "coredns"
	I1202 20:03:17.901684  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 20:03:17.901746  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 20:03:17.925995  202120 cri.go:89] found id: "80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3"
	I1202 20:03:17.926018  202120 cri.go:89] found id: ""
	I1202 20:03:17.926026  202120 logs.go:282] 1 containers: [80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3]
	I1202 20:03:17.926109  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:03:17.929735  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 20:03:17.929823  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 20:03:17.954950  202120 cri.go:89] found id: ""
	I1202 20:03:17.954971  202120 logs.go:282] 0 containers: []
	W1202 20:03:17.954979  202120 logs.go:284] No container was found matching "kube-proxy"
	I1202 20:03:17.954985  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 20:03:17.955048  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 20:03:17.980468  202120 cri.go:89] found id: "86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992"
	I1202 20:03:17.980490  202120 cri.go:89] found id: ""
	I1202 20:03:17.980499  202120 logs.go:282] 1 containers: [86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992]
	I1202 20:03:17.980569  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:03:17.984217  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 20:03:17.984312  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 20:03:18.015793  202120 cri.go:89] found id: ""
	I1202 20:03:18.015819  202120 logs.go:282] 0 containers: []
	W1202 20:03:18.015828  202120 logs.go:284] No container was found matching "kindnet"
	I1202 20:03:18.015835  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1202 20:03:18.015899  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1202 20:03:18.042352  202120 cri.go:89] found id: ""
	I1202 20:03:18.042376  202120 logs.go:282] 0 containers: []
	W1202 20:03:18.042385  202120 logs.go:284] No container was found matching "storage-provisioner"
	I1202 20:03:18.042415  202120 logs.go:123] Gathering logs for kube-controller-manager [86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992] ...
	I1202 20:03:18.042432  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992"
	I1202 20:03:18.077748  202120 logs.go:123] Gathering logs for containerd ...
	I1202 20:03:18.077781  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 20:03:18.111084  202120 logs.go:123] Gathering logs for dmesg ...
	I1202 20:03:18.111160  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 20:03:18.124076  202120 logs.go:123] Gathering logs for kube-apiserver [18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92] ...
	I1202 20:03:18.124104  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92"
	I1202 20:03:18.168774  202120 logs.go:123] Gathering logs for container status ...
	I1202 20:03:18.168806  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 20:03:18.197973  202120 logs.go:123] Gathering logs for kubelet ...
	I1202 20:03:18.198000  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 20:03:18.261189  202120 logs.go:123] Gathering logs for describe nodes ...
	I1202 20:03:18.261224  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 20:03:18.326472  202120 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 20:03:18.326540  202120 logs.go:123] Gathering logs for etcd [f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53] ...
	I1202 20:03:18.326563  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53"
	I1202 20:03:18.361896  202120 logs.go:123] Gathering logs for kube-scheduler [80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3] ...
	I1202 20:03:18.361929  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3"
	I1202 20:03:20.903228  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:03:20.913687  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 20:03:20.913757  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 20:03:20.939748  202120 cri.go:89] found id: "18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92"
	I1202 20:03:20.939770  202120 cri.go:89] found id: ""
	I1202 20:03:20.939779  202120 logs.go:282] 1 containers: [18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92]
	I1202 20:03:20.939837  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:03:20.943626  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 20:03:20.943706  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 20:03:20.969084  202120 cri.go:89] found id: "f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53"
	I1202 20:03:20.969107  202120 cri.go:89] found id: ""
	I1202 20:03:20.969115  202120 logs.go:282] 1 containers: [f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53]
	I1202 20:03:20.969192  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:03:20.973045  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 20:03:20.973124  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 20:03:20.998763  202120 cri.go:89] found id: ""
	I1202 20:03:20.998789  202120 logs.go:282] 0 containers: []
	W1202 20:03:20.998798  202120 logs.go:284] No container was found matching "coredns"
	I1202 20:03:20.998804  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 20:03:20.998869  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 20:03:21.025792  202120 cri.go:89] found id: "80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3"
	I1202 20:03:21.025815  202120 cri.go:89] found id: ""
	I1202 20:03:21.025823  202120 logs.go:282] 1 containers: [80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3]
	I1202 20:03:21.025882  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:03:21.029649  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 20:03:21.029721  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 20:03:21.054530  202120 cri.go:89] found id: ""
	I1202 20:03:21.054552  202120 logs.go:282] 0 containers: []
	W1202 20:03:21.054561  202120 logs.go:284] No container was found matching "kube-proxy"
	I1202 20:03:21.054568  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 20:03:21.054628  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 20:03:21.081869  202120 cri.go:89] found id: "86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992"
	I1202 20:03:21.081891  202120 cri.go:89] found id: ""
	I1202 20:03:21.081899  202120 logs.go:282] 1 containers: [86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992]
	I1202 20:03:21.081955  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:03:21.085553  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 20:03:21.085626  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 20:03:21.109584  202120 cri.go:89] found id: ""
	I1202 20:03:21.109618  202120 logs.go:282] 0 containers: []
	W1202 20:03:21.109627  202120 logs.go:284] No container was found matching "kindnet"
	I1202 20:03:21.109634  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1202 20:03:21.109720  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1202 20:03:21.133621  202120 cri.go:89] found id: ""
	I1202 20:03:21.133662  202120 logs.go:282] 0 containers: []
	W1202 20:03:21.133688  202120 logs.go:284] No container was found matching "storage-provisioner"
	I1202 20:03:21.133707  202120 logs.go:123] Gathering logs for etcd [f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53] ...
	I1202 20:03:21.133744  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53"
	I1202 20:03:21.165001  202120 logs.go:123] Gathering logs for kubelet ...
	I1202 20:03:21.165031  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 20:03:21.222323  202120 logs.go:123] Gathering logs for describe nodes ...
	I1202 20:03:21.222358  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 20:03:21.286904  202120 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 20:03:21.286924  202120 logs.go:123] Gathering logs for kube-scheduler [80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3] ...
	I1202 20:03:21.286937  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3"
	I1202 20:03:21.319323  202120 logs.go:123] Gathering logs for kube-controller-manager [86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992] ...
	I1202 20:03:21.319354  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992"
	I1202 20:03:21.352952  202120 logs.go:123] Gathering logs for containerd ...
	I1202 20:03:21.352983  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 20:03:21.384592  202120 logs.go:123] Gathering logs for container status ...
	I1202 20:03:21.384622  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 20:03:21.413650  202120 logs.go:123] Gathering logs for dmesg ...
	I1202 20:03:21.413685  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 20:03:21.433887  202120 logs.go:123] Gathering logs for kube-apiserver [18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92] ...
	I1202 20:03:21.433914  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92"
	I1202 20:03:23.989590  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:03:24.000496  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 20:03:24.000568  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 20:03:24.034266  202120 cri.go:89] found id: "18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92"
	I1202 20:03:24.034296  202120 cri.go:89] found id: ""
	I1202 20:03:24.034306  202120 logs.go:282] 1 containers: [18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92]
	I1202 20:03:24.034366  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:03:24.038362  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 20:03:24.038442  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 20:03:24.064902  202120 cri.go:89] found id: "f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53"
	I1202 20:03:24.064923  202120 cri.go:89] found id: ""
	I1202 20:03:24.064931  202120 logs.go:282] 1 containers: [f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53]
	I1202 20:03:24.064991  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:03:24.068839  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 20:03:24.068922  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 20:03:24.095012  202120 cri.go:89] found id: ""
	I1202 20:03:24.095046  202120 logs.go:282] 0 containers: []
	W1202 20:03:24.095055  202120 logs.go:284] No container was found matching "coredns"
	I1202 20:03:24.095080  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 20:03:24.095166  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 20:03:24.120581  202120 cri.go:89] found id: "80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3"
	I1202 20:03:24.120612  202120 cri.go:89] found id: ""
	I1202 20:03:24.120621  202120 logs.go:282] 1 containers: [80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3]
	I1202 20:03:24.120715  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:03:24.124479  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 20:03:24.124578  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 20:03:24.150485  202120 cri.go:89] found id: ""
	I1202 20:03:24.150510  202120 logs.go:282] 0 containers: []
	W1202 20:03:24.150518  202120 logs.go:284] No container was found matching "kube-proxy"
	I1202 20:03:24.150525  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 20:03:24.150584  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 20:03:24.176368  202120 cri.go:89] found id: "86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992"
	I1202 20:03:24.176394  202120 cri.go:89] found id: ""
	I1202 20:03:24.176403  202120 logs.go:282] 1 containers: [86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992]
	I1202 20:03:24.176469  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:03:24.180221  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 20:03:24.180301  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 20:03:24.204404  202120 cri.go:89] found id: ""
	I1202 20:03:24.204428  202120 logs.go:282] 0 containers: []
	W1202 20:03:24.204437  202120 logs.go:284] No container was found matching "kindnet"
	I1202 20:03:24.204444  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1202 20:03:24.204534  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1202 20:03:24.230007  202120 cri.go:89] found id: ""
	I1202 20:03:24.230070  202120 logs.go:282] 0 containers: []
	W1202 20:03:24.230094  202120 logs.go:284] No container was found matching "storage-provisioner"
	I1202 20:03:24.230114  202120 logs.go:123] Gathering logs for kube-apiserver [18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92] ...
	I1202 20:03:24.230129  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92"
	I1202 20:03:24.267743  202120 logs.go:123] Gathering logs for etcd [f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53] ...
	I1202 20:03:24.267772  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53"
	I1202 20:03:24.302481  202120 logs.go:123] Gathering logs for kube-controller-manager [86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992] ...
	I1202 20:03:24.302513  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992"
	I1202 20:03:24.348340  202120 logs.go:123] Gathering logs for kubelet ...
	I1202 20:03:24.348370  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 20:03:24.409207  202120 logs.go:123] Gathering logs for dmesg ...
	I1202 20:03:24.409246  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 20:03:24.425630  202120 logs.go:123] Gathering logs for describe nodes ...
	I1202 20:03:24.425704  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 20:03:24.508163  202120 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 20:03:24.508229  202120 logs.go:123] Gathering logs for kube-scheduler [80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3] ...
	I1202 20:03:24.508257  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3"
	I1202 20:03:24.544858  202120 logs.go:123] Gathering logs for containerd ...
	I1202 20:03:24.544889  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 20:03:24.576261  202120 logs.go:123] Gathering logs for container status ...
	I1202 20:03:24.576294  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 20:03:27.118773  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:03:27.129221  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 20:03:27.129292  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 20:03:27.153796  202120 cri.go:89] found id: "18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92"
	I1202 20:03:27.153818  202120 cri.go:89] found id: ""
	I1202 20:03:27.153826  202120 logs.go:282] 1 containers: [18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92]
	I1202 20:03:27.153883  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:03:27.157478  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 20:03:27.157550  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 20:03:27.181664  202120 cri.go:89] found id: "f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53"
	I1202 20:03:27.181684  202120 cri.go:89] found id: ""
	I1202 20:03:27.181692  202120 logs.go:282] 1 containers: [f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53]
	I1202 20:03:27.181749  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:03:27.185518  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 20:03:27.185596  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 20:03:27.210506  202120 cri.go:89] found id: ""
	I1202 20:03:27.210528  202120 logs.go:282] 0 containers: []
	W1202 20:03:27.210536  202120 logs.go:284] No container was found matching "coredns"
	I1202 20:03:27.210542  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 20:03:27.210602  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 20:03:27.240025  202120 cri.go:89] found id: "80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3"
	I1202 20:03:27.240046  202120 cri.go:89] found id: ""
	I1202 20:03:27.240055  202120 logs.go:282] 1 containers: [80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3]
	I1202 20:03:27.240115  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:03:27.243894  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 20:03:27.243971  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 20:03:27.268445  202120 cri.go:89] found id: ""
	I1202 20:03:27.268471  202120 logs.go:282] 0 containers: []
	W1202 20:03:27.268480  202120 logs.go:284] No container was found matching "kube-proxy"
	I1202 20:03:27.268487  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 20:03:27.268545  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 20:03:27.295077  202120 cri.go:89] found id: "86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992"
	I1202 20:03:27.295098  202120 cri.go:89] found id: ""
	I1202 20:03:27.295106  202120 logs.go:282] 1 containers: [86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992]
	I1202 20:03:27.295170  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:03:27.298852  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 20:03:27.298980  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 20:03:27.326808  202120 cri.go:89] found id: ""
	I1202 20:03:27.326830  202120 logs.go:282] 0 containers: []
	W1202 20:03:27.326838  202120 logs.go:284] No container was found matching "kindnet"
	I1202 20:03:27.326844  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1202 20:03:27.326903  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1202 20:03:27.351990  202120 cri.go:89] found id: ""
	I1202 20:03:27.352012  202120 logs.go:282] 0 containers: []
	W1202 20:03:27.352020  202120 logs.go:284] No container was found matching "storage-provisioner"
	I1202 20:03:27.352035  202120 logs.go:123] Gathering logs for kubelet ...
	I1202 20:03:27.352046  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 20:03:27.410358  202120 logs.go:123] Gathering logs for dmesg ...
	I1202 20:03:27.410393  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 20:03:27.424419  202120 logs.go:123] Gathering logs for describe nodes ...
	I1202 20:03:27.424495  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 20:03:27.509642  202120 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 20:03:27.509708  202120 logs.go:123] Gathering logs for etcd [f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53] ...
	I1202 20:03:27.509743  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53"
	I1202 20:03:27.541324  202120 logs.go:123] Gathering logs for containerd ...
	I1202 20:03:27.541358  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 20:03:27.573261  202120 logs.go:123] Gathering logs for container status ...
	I1202 20:03:27.573296  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 20:03:27.602406  202120 logs.go:123] Gathering logs for kube-apiserver [18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92] ...
	I1202 20:03:27.602435  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92"
	I1202 20:03:27.640737  202120 logs.go:123] Gathering logs for kube-scheduler [80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3] ...
	I1202 20:03:27.640769  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3"
	I1202 20:03:27.671806  202120 logs.go:123] Gathering logs for kube-controller-manager [86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992] ...
	I1202 20:03:27.671841  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992"
	I1202 20:03:30.207083  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:03:30.217644  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 20:03:30.217718  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 20:03:30.244361  202120 cri.go:89] found id: "18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92"
	I1202 20:03:30.244382  202120 cri.go:89] found id: ""
	I1202 20:03:30.244391  202120 logs.go:282] 1 containers: [18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92]
	I1202 20:03:30.244460  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:03:30.248515  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 20:03:30.248592  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 20:03:30.274478  202120 cri.go:89] found id: "f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53"
	I1202 20:03:30.274503  202120 cri.go:89] found id: ""
	I1202 20:03:30.274511  202120 logs.go:282] 1 containers: [f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53]
	I1202 20:03:30.274568  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:03:30.278364  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 20:03:30.278489  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 20:03:30.304048  202120 cri.go:89] found id: ""
	I1202 20:03:30.304071  202120 logs.go:282] 0 containers: []
	W1202 20:03:30.304081  202120 logs.go:284] No container was found matching "coredns"
	I1202 20:03:30.304088  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 20:03:30.304149  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 20:03:30.329824  202120 cri.go:89] found id: "80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3"
	I1202 20:03:30.329845  202120 cri.go:89] found id: ""
	I1202 20:03:30.329853  202120 logs.go:282] 1 containers: [80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3]
	I1202 20:03:30.329912  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:03:30.333658  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 20:03:30.333737  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 20:03:30.359378  202120 cri.go:89] found id: ""
	I1202 20:03:30.359405  202120 logs.go:282] 0 containers: []
	W1202 20:03:30.359415  202120 logs.go:284] No container was found matching "kube-proxy"
	I1202 20:03:30.359421  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 20:03:30.359544  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 20:03:30.385173  202120 cri.go:89] found id: "86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992"
	I1202 20:03:30.385200  202120 cri.go:89] found id: ""
	I1202 20:03:30.385209  202120 logs.go:282] 1 containers: [86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992]
	I1202 20:03:30.385303  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:03:30.389124  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 20:03:30.389219  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 20:03:30.420929  202120 cri.go:89] found id: ""
	I1202 20:03:30.420955  202120 logs.go:282] 0 containers: []
	W1202 20:03:30.420964  202120 logs.go:284] No container was found matching "kindnet"
	I1202 20:03:30.420971  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1202 20:03:30.421079  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1202 20:03:30.454694  202120 cri.go:89] found id: ""
	I1202 20:03:30.454717  202120 logs.go:282] 0 containers: []
	W1202 20:03:30.454726  202120 logs.go:284] No container was found matching "storage-provisioner"
	I1202 20:03:30.454739  202120 logs.go:123] Gathering logs for kubelet ...
	I1202 20:03:30.454750  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 20:03:30.525936  202120 logs.go:123] Gathering logs for kube-apiserver [18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92] ...
	I1202 20:03:30.525970  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92"
	I1202 20:03:30.567288  202120 logs.go:123] Gathering logs for etcd [f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53] ...
	I1202 20:03:30.567321  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53"
	I1202 20:03:30.599663  202120 logs.go:123] Gathering logs for kube-scheduler [80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3] ...
	I1202 20:03:30.599699  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3"
	I1202 20:03:30.632248  202120 logs.go:123] Gathering logs for kube-controller-manager [86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992] ...
	I1202 20:03:30.632278  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992"
	I1202 20:03:30.668044  202120 logs.go:123] Gathering logs for containerd ...
	I1202 20:03:30.668073  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 20:03:30.700995  202120 logs.go:123] Gathering logs for dmesg ...
	I1202 20:03:30.701035  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 20:03:30.714653  202120 logs.go:123] Gathering logs for describe nodes ...
	I1202 20:03:30.714681  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 20:03:30.789845  202120 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 20:03:30.789912  202120 logs.go:123] Gathering logs for container status ...
	I1202 20:03:30.789932  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 20:03:33.323366  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:03:33.334073  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 20:03:33.334150  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 20:03:33.360399  202120 cri.go:89] found id: "18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92"
	I1202 20:03:33.360422  202120 cri.go:89] found id: ""
	I1202 20:03:33.360430  202120 logs.go:282] 1 containers: [18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92]
	I1202 20:03:33.360489  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:03:33.364275  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 20:03:33.364374  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 20:03:33.389243  202120 cri.go:89] found id: "f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53"
	I1202 20:03:33.389265  202120 cri.go:89] found id: ""
	I1202 20:03:33.389273  202120 logs.go:282] 1 containers: [f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53]
	I1202 20:03:33.389329  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:03:33.393173  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 20:03:33.393246  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 20:03:33.417575  202120 cri.go:89] found id: ""
	I1202 20:03:33.417597  202120 logs.go:282] 0 containers: []
	W1202 20:03:33.417605  202120 logs.go:284] No container was found matching "coredns"
	I1202 20:03:33.417612  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 20:03:33.417668  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 20:03:33.452393  202120 cri.go:89] found id: "80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3"
	I1202 20:03:33.452413  202120 cri.go:89] found id: ""
	I1202 20:03:33.452421  202120 logs.go:282] 1 containers: [80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3]
	I1202 20:03:33.452478  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:03:33.456798  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 20:03:33.456865  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 20:03:33.487270  202120 cri.go:89] found id: ""
	I1202 20:03:33.487340  202120 logs.go:282] 0 containers: []
	W1202 20:03:33.487362  202120 logs.go:284] No container was found matching "kube-proxy"
	I1202 20:03:33.487383  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 20:03:33.487474  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 20:03:33.513291  202120 cri.go:89] found id: "86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992"
	I1202 20:03:33.513323  202120 cri.go:89] found id: ""
	I1202 20:03:33.513331  202120 logs.go:282] 1 containers: [86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992]
	I1202 20:03:33.513405  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:03:33.517274  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 20:03:33.517368  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 20:03:33.542569  202120 cri.go:89] found id: ""
	I1202 20:03:33.542591  202120 logs.go:282] 0 containers: []
	W1202 20:03:33.542599  202120 logs.go:284] No container was found matching "kindnet"
	I1202 20:03:33.542605  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1202 20:03:33.542664  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1202 20:03:33.577229  202120 cri.go:89] found id: ""
	I1202 20:03:33.577252  202120 logs.go:282] 0 containers: []
	W1202 20:03:33.577260  202120 logs.go:284] No container was found matching "storage-provisioner"
	I1202 20:03:33.577274  202120 logs.go:123] Gathering logs for dmesg ...
	I1202 20:03:33.577291  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 20:03:33.590271  202120 logs.go:123] Gathering logs for describe nodes ...
	I1202 20:03:33.590299  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 20:03:33.657384  202120 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 20:03:33.657403  202120 logs.go:123] Gathering logs for etcd [f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53] ...
	I1202 20:03:33.657415  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53"
	I1202 20:03:33.690388  202120 logs.go:123] Gathering logs for kube-scheduler [80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3] ...
	I1202 20:03:33.690419  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3"
	I1202 20:03:33.726859  202120 logs.go:123] Gathering logs for kube-controller-manager [86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992] ...
	I1202 20:03:33.726889  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992"
	I1202 20:03:33.782886  202120 logs.go:123] Gathering logs for container status ...
	I1202 20:03:33.782915  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 20:03:33.812128  202120 logs.go:123] Gathering logs for kubelet ...
	I1202 20:03:33.812157  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 20:03:33.874500  202120 logs.go:123] Gathering logs for kube-apiserver [18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92] ...
	I1202 20:03:33.874538  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92"
	I1202 20:03:33.908164  202120 logs.go:123] Gathering logs for containerd ...
	I1202 20:03:33.908193  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 20:03:36.442700  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:03:36.453589  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 20:03:36.453675  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 20:03:36.488385  202120 cri.go:89] found id: "18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92"
	I1202 20:03:36.488409  202120 cri.go:89] found id: ""
	I1202 20:03:36.488418  202120 logs.go:282] 1 containers: [18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92]
	I1202 20:03:36.488475  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:03:36.492766  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 20:03:36.492837  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 20:03:36.518973  202120 cri.go:89] found id: "f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53"
	I1202 20:03:36.518993  202120 cri.go:89] found id: ""
	I1202 20:03:36.519001  202120 logs.go:282] 1 containers: [f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53]
	I1202 20:03:36.519060  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:03:36.522701  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 20:03:36.522813  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 20:03:36.546636  202120 cri.go:89] found id: ""
	I1202 20:03:36.546664  202120 logs.go:282] 0 containers: []
	W1202 20:03:36.546673  202120 logs.go:284] No container was found matching "coredns"
	I1202 20:03:36.546679  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 20:03:36.546743  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 20:03:36.571920  202120 cri.go:89] found id: "80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3"
	I1202 20:03:36.571948  202120 cri.go:89] found id: ""
	I1202 20:03:36.571957  202120 logs.go:282] 1 containers: [80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3]
	I1202 20:03:36.572018  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:03:36.575912  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 20:03:36.576002  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 20:03:36.601239  202120 cri.go:89] found id: ""
	I1202 20:03:36.601265  202120 logs.go:282] 0 containers: []
	W1202 20:03:36.601275  202120 logs.go:284] No container was found matching "kube-proxy"
	I1202 20:03:36.601282  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 20:03:36.601344  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 20:03:36.627521  202120 cri.go:89] found id: "86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992"
	I1202 20:03:36.627592  202120 cri.go:89] found id: ""
	I1202 20:03:36.627615  202120 logs.go:282] 1 containers: [86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992]
	I1202 20:03:36.627708  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:03:36.631416  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 20:03:36.631493  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 20:03:36.659300  202120 cri.go:89] found id: ""
	I1202 20:03:36.659327  202120 logs.go:282] 0 containers: []
	W1202 20:03:36.659336  202120 logs.go:284] No container was found matching "kindnet"
	I1202 20:03:36.659343  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1202 20:03:36.659458  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1202 20:03:36.687646  202120 cri.go:89] found id: ""
	I1202 20:03:36.687671  202120 logs.go:282] 0 containers: []
	W1202 20:03:36.687680  202120 logs.go:284] No container was found matching "storage-provisioner"
	I1202 20:03:36.687719  202120 logs.go:123] Gathering logs for kubelet ...
	I1202 20:03:36.687738  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 20:03:36.745052  202120 logs.go:123] Gathering logs for describe nodes ...
	I1202 20:03:36.745088  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 20:03:36.816940  202120 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 20:03:36.816962  202120 logs.go:123] Gathering logs for etcd [f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53] ...
	I1202 20:03:36.816975  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53"
	I1202 20:03:36.849619  202120 logs.go:123] Gathering logs for container status ...
	I1202 20:03:36.849655  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 20:03:36.882015  202120 logs.go:123] Gathering logs for dmesg ...
	I1202 20:03:36.882044  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 20:03:36.895385  202120 logs.go:123] Gathering logs for kube-apiserver [18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92] ...
	I1202 20:03:36.895423  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92"
	I1202 20:03:36.932109  202120 logs.go:123] Gathering logs for kube-scheduler [80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3] ...
	I1202 20:03:36.932142  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3"
	I1202 20:03:36.968537  202120 logs.go:123] Gathering logs for kube-controller-manager [86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992] ...
	I1202 20:03:36.968570  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992"
	I1202 20:03:37.007109  202120 logs.go:123] Gathering logs for containerd ...
	I1202 20:03:37.007145  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 20:03:39.546832  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:03:39.558530  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 20:03:39.558596  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 20:03:39.589681  202120 cri.go:89] found id: "18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92"
	I1202 20:03:39.589705  202120 cri.go:89] found id: ""
	I1202 20:03:39.589714  202120 logs.go:282] 1 containers: [18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92]
	I1202 20:03:39.589775  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:03:39.593669  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 20:03:39.593750  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 20:03:39.622646  202120 cri.go:89] found id: "f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53"
	I1202 20:03:39.622670  202120 cri.go:89] found id: ""
	I1202 20:03:39.622677  202120 logs.go:282] 1 containers: [f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53]
	I1202 20:03:39.622738  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:03:39.626669  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 20:03:39.626748  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 20:03:39.656651  202120 cri.go:89] found id: ""
	I1202 20:03:39.656678  202120 logs.go:282] 0 containers: []
	W1202 20:03:39.656687  202120 logs.go:284] No container was found matching "coredns"
	I1202 20:03:39.656693  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 20:03:39.656758  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 20:03:39.684166  202120 cri.go:89] found id: "80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3"
	I1202 20:03:39.684191  202120 cri.go:89] found id: ""
	I1202 20:03:39.684199  202120 logs.go:282] 1 containers: [80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3]
	I1202 20:03:39.684291  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:03:39.688070  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 20:03:39.688146  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 20:03:39.713148  202120 cri.go:89] found id: ""
	I1202 20:03:39.713174  202120 logs.go:282] 0 containers: []
	W1202 20:03:39.713183  202120 logs.go:284] No container was found matching "kube-proxy"
	I1202 20:03:39.713190  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 20:03:39.713249  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 20:03:39.738391  202120 cri.go:89] found id: "86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992"
	I1202 20:03:39.738413  202120 cri.go:89] found id: ""
	I1202 20:03:39.738422  202120 logs.go:282] 1 containers: [86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992]
	I1202 20:03:39.738480  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:03:39.742254  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 20:03:39.742325  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 20:03:39.773598  202120 cri.go:89] found id: ""
	I1202 20:03:39.773673  202120 logs.go:282] 0 containers: []
	W1202 20:03:39.773689  202120 logs.go:284] No container was found matching "kindnet"
	I1202 20:03:39.773697  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1202 20:03:39.773767  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1202 20:03:39.801943  202120 cri.go:89] found id: ""
	I1202 20:03:39.801979  202120 logs.go:282] 0 containers: []
	W1202 20:03:39.801989  202120 logs.go:284] No container was found matching "storage-provisioner"
	I1202 20:03:39.802004  202120 logs.go:123] Gathering logs for kube-scheduler [80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3] ...
	I1202 20:03:39.802019  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3"
	I1202 20:03:39.839520  202120 logs.go:123] Gathering logs for kube-controller-manager [86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992] ...
	I1202 20:03:39.839551  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992"
	I1202 20:03:39.873599  202120 logs.go:123] Gathering logs for containerd ...
	I1202 20:03:39.873632  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 20:03:39.908184  202120 logs.go:123] Gathering logs for container status ...
	I1202 20:03:39.908214  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 20:03:39.946930  202120 logs.go:123] Gathering logs for kubelet ...
	I1202 20:03:39.946962  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 20:03:40.005537  202120 logs.go:123] Gathering logs for dmesg ...
	I1202 20:03:40.005578  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 20:03:40.026349  202120 logs.go:123] Gathering logs for describe nodes ...
	I1202 20:03:40.026378  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 20:03:40.096913  202120 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 20:03:40.096974  202120 logs.go:123] Gathering logs for kube-apiserver [18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92] ...
	I1202 20:03:40.097006  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92"
	I1202 20:03:40.153093  202120 logs.go:123] Gathering logs for etcd [f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53] ...
	I1202 20:03:40.153132  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53"
	I1202 20:03:42.708434  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:03:42.720770  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 20:03:42.720842  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 20:03:42.779445  202120 cri.go:89] found id: "18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92"
	I1202 20:03:42.779466  202120 cri.go:89] found id: ""
	I1202 20:03:42.779474  202120 logs.go:282] 1 containers: [18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92]
	I1202 20:03:42.779582  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:03:42.784424  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 20:03:42.784494  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 20:03:42.822111  202120 cri.go:89] found id: "f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53"
	I1202 20:03:42.822131  202120 cri.go:89] found id: ""
	I1202 20:03:42.822138  202120 logs.go:282] 1 containers: [f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53]
	I1202 20:03:42.822195  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:03:42.826801  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 20:03:42.826872  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 20:03:42.862588  202120 cri.go:89] found id: ""
	I1202 20:03:42.862609  202120 logs.go:282] 0 containers: []
	W1202 20:03:42.862617  202120 logs.go:284] No container was found matching "coredns"
	I1202 20:03:42.862623  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 20:03:42.862688  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 20:03:42.909756  202120 cri.go:89] found id: "80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3"
	I1202 20:03:42.909777  202120 cri.go:89] found id: ""
	I1202 20:03:42.909786  202120 logs.go:282] 1 containers: [80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3]
	I1202 20:03:42.909847  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:03:42.914443  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 20:03:42.914517  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 20:03:42.945825  202120 cri.go:89] found id: ""
	I1202 20:03:42.945848  202120 logs.go:282] 0 containers: []
	W1202 20:03:42.945856  202120 logs.go:284] No container was found matching "kube-proxy"
	I1202 20:03:42.945866  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 20:03:42.945925  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 20:03:42.984009  202120 cri.go:89] found id: "86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992"
	I1202 20:03:42.984033  202120 cri.go:89] found id: ""
	I1202 20:03:42.984042  202120 logs.go:282] 1 containers: [86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992]
	I1202 20:03:42.984097  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:03:42.995885  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 20:03:42.995974  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 20:03:43.039688  202120 cri.go:89] found id: ""
	I1202 20:03:43.039717  202120 logs.go:282] 0 containers: []
	W1202 20:03:43.039726  202120 logs.go:284] No container was found matching "kindnet"
	I1202 20:03:43.039733  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1202 20:03:43.039791  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1202 20:03:43.071179  202120 cri.go:89] found id: ""
	I1202 20:03:43.071199  202120 logs.go:282] 0 containers: []
	W1202 20:03:43.071208  202120 logs.go:284] No container was found matching "storage-provisioner"
	I1202 20:03:43.071221  202120 logs.go:123] Gathering logs for kubelet ...
	I1202 20:03:43.071233  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 20:03:43.146070  202120 logs.go:123] Gathering logs for describe nodes ...
	I1202 20:03:43.146268  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 20:03:43.236682  202120 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 20:03:43.236704  202120 logs.go:123] Gathering logs for kube-scheduler [80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3] ...
	I1202 20:03:43.236717  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3"
	I1202 20:03:43.272864  202120 logs.go:123] Gathering logs for containerd ...
	I1202 20:03:43.272892  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 20:03:43.303868  202120 logs.go:123] Gathering logs for container status ...
	I1202 20:03:43.303905  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 20:03:43.333743  202120 logs.go:123] Gathering logs for dmesg ...
	I1202 20:03:43.333769  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 20:03:43.346440  202120 logs.go:123] Gathering logs for kube-apiserver [18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92] ...
	I1202 20:03:43.346469  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92"
	I1202 20:03:43.392030  202120 logs.go:123] Gathering logs for etcd [f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53] ...
	I1202 20:03:43.392061  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53"
	I1202 20:03:43.426082  202120 logs.go:123] Gathering logs for kube-controller-manager [86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992] ...
	I1202 20:03:43.426109  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992"
	I1202 20:03:45.974832  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:03:45.985181  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 20:03:45.985250  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 20:03:46.021758  202120 cri.go:89] found id: "18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92"
	I1202 20:03:46.021780  202120 cri.go:89] found id: ""
	I1202 20:03:46.021788  202120 logs.go:282] 1 containers: [18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92]
	I1202 20:03:46.021851  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:03:46.026655  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 20:03:46.026736  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 20:03:46.057464  202120 cri.go:89] found id: "f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53"
	I1202 20:03:46.057495  202120 cri.go:89] found id: ""
	I1202 20:03:46.057504  202120 logs.go:282] 1 containers: [f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53]
	I1202 20:03:46.057575  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:03:46.062212  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 20:03:46.062281  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 20:03:46.097739  202120 cri.go:89] found id: ""
	I1202 20:03:46.097827  202120 logs.go:282] 0 containers: []
	W1202 20:03:46.097855  202120 logs.go:284] No container was found matching "coredns"
	I1202 20:03:46.097873  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 20:03:46.097973  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 20:03:46.145247  202120 cri.go:89] found id: "80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3"
	I1202 20:03:46.145266  202120 cri.go:89] found id: ""
	I1202 20:03:46.145274  202120 logs.go:282] 1 containers: [80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3]
	I1202 20:03:46.145332  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:03:46.150057  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 20:03:46.150129  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 20:03:46.212181  202120 cri.go:89] found id: ""
	I1202 20:03:46.212202  202120 logs.go:282] 0 containers: []
	W1202 20:03:46.212210  202120 logs.go:284] No container was found matching "kube-proxy"
	I1202 20:03:46.212216  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 20:03:46.212275  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 20:03:46.283414  202120 cri.go:89] found id: "86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992"
	I1202 20:03:46.283435  202120 cri.go:89] found id: ""
	I1202 20:03:46.283443  202120 logs.go:282] 1 containers: [86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992]
	I1202 20:03:46.283505  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:03:46.287605  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 20:03:46.287677  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 20:03:46.326390  202120 cri.go:89] found id: ""
	I1202 20:03:46.326412  202120 logs.go:282] 0 containers: []
	W1202 20:03:46.326420  202120 logs.go:284] No container was found matching "kindnet"
	I1202 20:03:46.326426  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1202 20:03:46.326486  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1202 20:03:46.376424  202120 cri.go:89] found id: ""
	I1202 20:03:46.376450  202120 logs.go:282] 0 containers: []
	W1202 20:03:46.376459  202120 logs.go:284] No container was found matching "storage-provisioner"
	I1202 20:03:46.376474  202120 logs.go:123] Gathering logs for kubelet ...
	I1202 20:03:46.376486  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 20:03:46.446494  202120 logs.go:123] Gathering logs for dmesg ...
	I1202 20:03:46.446563  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 20:03:46.460559  202120 logs.go:123] Gathering logs for describe nodes ...
	I1202 20:03:46.460584  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 20:03:46.552671  202120 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 20:03:46.552691  202120 logs.go:123] Gathering logs for kube-apiserver [18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92] ...
	I1202 20:03:46.552703  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92"
	I1202 20:03:46.606640  202120 logs.go:123] Gathering logs for etcd [f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53] ...
	I1202 20:03:46.606871  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53"
	I1202 20:03:46.666074  202120 logs.go:123] Gathering logs for kube-scheduler [80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3] ...
	I1202 20:03:46.666597  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3"
	I1202 20:03:46.707833  202120 logs.go:123] Gathering logs for kube-controller-manager [86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992] ...
	I1202 20:03:46.707925  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992"
	I1202 20:03:46.747850  202120 logs.go:123] Gathering logs for containerd ...
	I1202 20:03:46.747980  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 20:03:46.796805  202120 logs.go:123] Gathering logs for container status ...
	I1202 20:03:46.796928  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 20:03:49.335777  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:03:49.346482  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 20:03:49.346553  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 20:03:49.383776  202120 cri.go:89] found id: "18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92"
	I1202 20:03:49.383797  202120 cri.go:89] found id: ""
	I1202 20:03:49.383805  202120 logs.go:282] 1 containers: [18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92]
	I1202 20:03:49.383861  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:03:49.388419  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 20:03:49.388491  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 20:03:49.427172  202120 cri.go:89] found id: "f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53"
	I1202 20:03:49.427191  202120 cri.go:89] found id: ""
	I1202 20:03:49.427199  202120 logs.go:282] 1 containers: [f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53]
	I1202 20:03:49.427256  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:03:49.431685  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 20:03:49.431808  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 20:03:49.465346  202120 cri.go:89] found id: ""
	I1202 20:03:49.465368  202120 logs.go:282] 0 containers: []
	W1202 20:03:49.465375  202120 logs.go:284] No container was found matching "coredns"
	I1202 20:03:49.465383  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 20:03:49.465441  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 20:03:49.492349  202120 cri.go:89] found id: "80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3"
	I1202 20:03:49.492370  202120 cri.go:89] found id: ""
	I1202 20:03:49.492378  202120 logs.go:282] 1 containers: [80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3]
	I1202 20:03:49.492434  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:03:49.497287  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 20:03:49.497359  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 20:03:49.532642  202120 cri.go:89] found id: ""
	I1202 20:03:49.532716  202120 logs.go:282] 0 containers: []
	W1202 20:03:49.532740  202120 logs.go:284] No container was found matching "kube-proxy"
	I1202 20:03:49.532761  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 20:03:49.532845  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 20:03:49.561969  202120 cri.go:89] found id: "86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992"
	I1202 20:03:49.562040  202120 cri.go:89] found id: ""
	I1202 20:03:49.562063  202120 logs.go:282] 1 containers: [86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992]
	I1202 20:03:49.562155  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:03:49.566650  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 20:03:49.566770  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 20:03:49.598774  202120 cri.go:89] found id: ""
	I1202 20:03:49.598846  202120 logs.go:282] 0 containers: []
	W1202 20:03:49.598869  202120 logs.go:284] No container was found matching "kindnet"
	I1202 20:03:49.598890  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1202 20:03:49.598976  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1202 20:03:49.631563  202120 cri.go:89] found id: ""
	I1202 20:03:49.631638  202120 logs.go:282] 0 containers: []
	W1202 20:03:49.631662  202120 logs.go:284] No container was found matching "storage-provisioner"
	I1202 20:03:49.631692  202120 logs.go:123] Gathering logs for kubelet ...
	I1202 20:03:49.631736  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 20:03:49.698192  202120 logs.go:123] Gathering logs for describe nodes ...
	I1202 20:03:49.698244  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 20:03:49.791979  202120 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 20:03:49.792051  202120 logs.go:123] Gathering logs for etcd [f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53] ...
	I1202 20:03:49.792079  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53"
	I1202 20:03:49.829483  202120 logs.go:123] Gathering logs for kube-controller-manager [86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992] ...
	I1202 20:03:49.829516  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992"
	I1202 20:03:49.895799  202120 logs.go:123] Gathering logs for container status ...
	I1202 20:03:49.896224  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 20:03:49.976734  202120 logs.go:123] Gathering logs for dmesg ...
	I1202 20:03:49.976815  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 20:03:50.003920  202120 logs.go:123] Gathering logs for kube-apiserver [18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92] ...
	I1202 20:03:50.003996  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92"
	I1202 20:03:50.079037  202120 logs.go:123] Gathering logs for kube-scheduler [80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3] ...
	I1202 20:03:50.079113  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3"
	I1202 20:03:50.126697  202120 logs.go:123] Gathering logs for containerd ...
	I1202 20:03:50.126737  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 20:03:52.666640  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:03:52.677062  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 20:03:52.677141  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 20:03:52.702925  202120 cri.go:89] found id: "18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92"
	I1202 20:03:52.702944  202120 cri.go:89] found id: ""
	I1202 20:03:52.702953  202120 logs.go:282] 1 containers: [18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92]
	I1202 20:03:52.703010  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:03:52.706872  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 20:03:52.706940  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 20:03:52.735058  202120 cri.go:89] found id: "f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53"
	I1202 20:03:52.735078  202120 cri.go:89] found id: ""
	I1202 20:03:52.735086  202120 logs.go:282] 1 containers: [f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53]
	I1202 20:03:52.735146  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:03:52.739002  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 20:03:52.739070  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 20:03:52.773921  202120 cri.go:89] found id: ""
	I1202 20:03:52.773946  202120 logs.go:282] 0 containers: []
	W1202 20:03:52.773956  202120 logs.go:284] No container was found matching "coredns"
	I1202 20:03:52.773962  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 20:03:52.774025  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 20:03:52.800211  202120 cri.go:89] found id: "80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3"
	I1202 20:03:52.800235  202120 cri.go:89] found id: ""
	I1202 20:03:52.800250  202120 logs.go:282] 1 containers: [80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3]
	I1202 20:03:52.800309  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:03:52.804685  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 20:03:52.804766  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 20:03:52.830643  202120 cri.go:89] found id: ""
	I1202 20:03:52.830668  202120 logs.go:282] 0 containers: []
	W1202 20:03:52.830678  202120 logs.go:284] No container was found matching "kube-proxy"
	I1202 20:03:52.830685  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 20:03:52.830744  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 20:03:52.861427  202120 cri.go:89] found id: "86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992"
	I1202 20:03:52.861454  202120 cri.go:89] found id: ""
	I1202 20:03:52.861463  202120 logs.go:282] 1 containers: [86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992]
	I1202 20:03:52.861523  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:03:52.865440  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 20:03:52.865513  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 20:03:52.892749  202120 cri.go:89] found id: ""
	I1202 20:03:52.892828  202120 logs.go:282] 0 containers: []
	W1202 20:03:52.892852  202120 logs.go:284] No container was found matching "kindnet"
	I1202 20:03:52.892865  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1202 20:03:52.892949  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1202 20:03:52.928252  202120 cri.go:89] found id: ""
	I1202 20:03:52.928279  202120 logs.go:282] 0 containers: []
	W1202 20:03:52.928289  202120 logs.go:284] No container was found matching "storage-provisioner"
	I1202 20:03:52.928337  202120 logs.go:123] Gathering logs for dmesg ...
	I1202 20:03:52.928357  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 20:03:52.944263  202120 logs.go:123] Gathering logs for kube-apiserver [18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92] ...
	I1202 20:03:52.944294  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92"
	I1202 20:03:52.987119  202120 logs.go:123] Gathering logs for etcd [f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53] ...
	I1202 20:03:52.987153  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53"
	I1202 20:03:53.023225  202120 logs.go:123] Gathering logs for kube-controller-manager [86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992] ...
	I1202 20:03:53.023256  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992"
	I1202 20:03:53.069719  202120 logs.go:123] Gathering logs for containerd ...
	I1202 20:03:53.069754  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 20:03:53.112851  202120 logs.go:123] Gathering logs for container status ...
	I1202 20:03:53.112892  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 20:03:53.175798  202120 logs.go:123] Gathering logs for kubelet ...
	I1202 20:03:53.175885  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 20:03:53.242728  202120 logs.go:123] Gathering logs for describe nodes ...
	I1202 20:03:53.242764  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 20:03:53.350279  202120 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 20:03:53.350303  202120 logs.go:123] Gathering logs for kube-scheduler [80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3] ...
	I1202 20:03:53.350317  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3"
	I1202 20:03:55.885312  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:03:55.895446  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 20:03:55.895519  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 20:03:55.925153  202120 cri.go:89] found id: "18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92"
	I1202 20:03:55.925173  202120 cri.go:89] found id: ""
	I1202 20:03:55.925181  202120 logs.go:282] 1 containers: [18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92]
	I1202 20:03:55.925242  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:03:55.929760  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 20:03:55.929835  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 20:03:55.961181  202120 cri.go:89] found id: "f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53"
	I1202 20:03:55.961200  202120 cri.go:89] found id: ""
	I1202 20:03:55.961208  202120 logs.go:282] 1 containers: [f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53]
	I1202 20:03:55.961265  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:03:55.965381  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 20:03:55.965449  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 20:03:55.996901  202120 cri.go:89] found id: ""
	I1202 20:03:55.996979  202120 logs.go:282] 0 containers: []
	W1202 20:03:55.997002  202120 logs.go:284] No container was found matching "coredns"
	I1202 20:03:55.997016  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 20:03:55.997093  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 20:03:56.025260  202120 cri.go:89] found id: "80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3"
	I1202 20:03:56.025279  202120 cri.go:89] found id: ""
	I1202 20:03:56.025287  202120 logs.go:282] 1 containers: [80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3]
	I1202 20:03:56.025348  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:03:56.029438  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 20:03:56.029551  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 20:03:56.055574  202120 cri.go:89] found id: ""
	I1202 20:03:56.055602  202120 logs.go:282] 0 containers: []
	W1202 20:03:56.055613  202120 logs.go:284] No container was found matching "kube-proxy"
	I1202 20:03:56.055620  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 20:03:56.055685  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 20:03:56.083382  202120 cri.go:89] found id: "86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992"
	I1202 20:03:56.083406  202120 cri.go:89] found id: ""
	I1202 20:03:56.083415  202120 logs.go:282] 1 containers: [86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992]
	I1202 20:03:56.083481  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:03:56.087179  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 20:03:56.087275  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 20:03:56.116837  202120 cri.go:89] found id: ""
	I1202 20:03:56.116860  202120 logs.go:282] 0 containers: []
	W1202 20:03:56.116868  202120 logs.go:284] No container was found matching "kindnet"
	I1202 20:03:56.116891  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1202 20:03:56.116973  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1202 20:03:56.141622  202120 cri.go:89] found id: ""
	I1202 20:03:56.141699  202120 logs.go:282] 0 containers: []
	W1202 20:03:56.141721  202120 logs.go:284] No container was found matching "storage-provisioner"
	I1202 20:03:56.141747  202120 logs.go:123] Gathering logs for kube-controller-manager [86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992] ...
	I1202 20:03:56.141765  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992"
	I1202 20:03:56.175154  202120 logs.go:123] Gathering logs for kubelet ...
	I1202 20:03:56.175185  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 20:03:56.235390  202120 logs.go:123] Gathering logs for dmesg ...
	I1202 20:03:56.235430  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 20:03:56.248484  202120 logs.go:123] Gathering logs for describe nodes ...
	I1202 20:03:56.248512  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 20:03:56.311778  202120 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 20:03:56.311800  202120 logs.go:123] Gathering logs for kube-apiserver [18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92] ...
	I1202 20:03:56.311813  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92"
	I1202 20:03:56.354589  202120 logs.go:123] Gathering logs for etcd [f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53] ...
	I1202 20:03:56.354619  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53"
	I1202 20:03:56.388332  202120 logs.go:123] Gathering logs for kube-scheduler [80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3] ...
	I1202 20:03:56.388419  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3"
	I1202 20:03:56.424531  202120 logs.go:123] Gathering logs for containerd ...
	I1202 20:03:56.424562  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 20:03:56.456366  202120 logs.go:123] Gathering logs for container status ...
	I1202 20:03:56.456463  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 20:03:58.988846  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:03:58.998956  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 20:03:58.999028  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 20:03:59.024980  202120 cri.go:89] found id: "18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92"
	I1202 20:03:59.025003  202120 cri.go:89] found id: ""
	I1202 20:03:59.025010  202120 logs.go:282] 1 containers: [18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92]
	I1202 20:03:59.025077  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:03:59.028826  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 20:03:59.028903  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 20:03:59.055331  202120 cri.go:89] found id: "f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53"
	I1202 20:03:59.055356  202120 cri.go:89] found id: ""
	I1202 20:03:59.055364  202120 logs.go:282] 1 containers: [f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53]
	I1202 20:03:59.055423  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:03:59.059510  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 20:03:59.059583  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 20:03:59.087171  202120 cri.go:89] found id: ""
	I1202 20:03:59.087198  202120 logs.go:282] 0 containers: []
	W1202 20:03:59.087207  202120 logs.go:284] No container was found matching "coredns"
	I1202 20:03:59.087214  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 20:03:59.087275  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 20:03:59.113814  202120 cri.go:89] found id: "80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3"
	I1202 20:03:59.113838  202120 cri.go:89] found id: ""
	I1202 20:03:59.113846  202120 logs.go:282] 1 containers: [80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3]
	I1202 20:03:59.113907  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:03:59.117537  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 20:03:59.117611  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 20:03:59.143409  202120 cri.go:89] found id: ""
	I1202 20:03:59.143430  202120 logs.go:282] 0 containers: []
	W1202 20:03:59.143438  202120 logs.go:284] No container was found matching "kube-proxy"
	I1202 20:03:59.143444  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 20:03:59.143502  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 20:03:59.171341  202120 cri.go:89] found id: "86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992"
	I1202 20:03:59.171366  202120 cri.go:89] found id: ""
	I1202 20:03:59.171374  202120 logs.go:282] 1 containers: [86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992]
	I1202 20:03:59.171431  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:03:59.176713  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 20:03:59.176800  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 20:03:59.204501  202120 cri.go:89] found id: ""
	I1202 20:03:59.204525  202120 logs.go:282] 0 containers: []
	W1202 20:03:59.204532  202120 logs.go:284] No container was found matching "kindnet"
	I1202 20:03:59.204542  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1202 20:03:59.204603  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1202 20:03:59.233334  202120 cri.go:89] found id: ""
	I1202 20:03:59.233357  202120 logs.go:282] 0 containers: []
	W1202 20:03:59.233365  202120 logs.go:284] No container was found matching "storage-provisioner"
	I1202 20:03:59.233379  202120 logs.go:123] Gathering logs for kubelet ...
	I1202 20:03:59.233390  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 20:03:59.294952  202120 logs.go:123] Gathering logs for kube-apiserver [18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92] ...
	I1202 20:03:59.294987  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92"
	I1202 20:03:59.337921  202120 logs.go:123] Gathering logs for etcd [f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53] ...
	I1202 20:03:59.337955  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53"
	I1202 20:03:59.370519  202120 logs.go:123] Gathering logs for kube-scheduler [80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3] ...
	I1202 20:03:59.370550  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3"
	I1202 20:03:59.402985  202120 logs.go:123] Gathering logs for kube-controller-manager [86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992] ...
	I1202 20:03:59.403019  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992"
	I1202 20:03:59.437700  202120 logs.go:123] Gathering logs for containerd ...
	I1202 20:03:59.437735  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 20:03:59.469957  202120 logs.go:123] Gathering logs for dmesg ...
	I1202 20:03:59.469990  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 20:03:59.482872  202120 logs.go:123] Gathering logs for describe nodes ...
	I1202 20:03:59.482900  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 20:03:59.545115  202120 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 20:03:59.545137  202120 logs.go:123] Gathering logs for container status ...
	I1202 20:03:59.545149  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 20:04:02.078457  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:04:02.091157  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 20:04:02.091229  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 20:04:02.121799  202120 cri.go:89] found id: "18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92"
	I1202 20:04:02.121825  202120 cri.go:89] found id: ""
	I1202 20:04:02.121833  202120 logs.go:282] 1 containers: [18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92]
	I1202 20:04:02.121912  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:04:02.126983  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 20:04:02.127089  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 20:04:02.157526  202120 cri.go:89] found id: "f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53"
	I1202 20:04:02.157548  202120 cri.go:89] found id: ""
	I1202 20:04:02.157557  202120 logs.go:282] 1 containers: [f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53]
	I1202 20:04:02.157627  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:04:02.161517  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 20:04:02.161607  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 20:04:02.190107  202120 cri.go:89] found id: ""
	I1202 20:04:02.190137  202120 logs.go:282] 0 containers: []
	W1202 20:04:02.190146  202120 logs.go:284] No container was found matching "coredns"
	I1202 20:04:02.190152  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 20:04:02.190213  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 20:04:02.215919  202120 cri.go:89] found id: "80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3"
	I1202 20:04:02.215944  202120 cri.go:89] found id: ""
	I1202 20:04:02.215954  202120 logs.go:282] 1 containers: [80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3]
	I1202 20:04:02.216025  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:04:02.219835  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 20:04:02.219934  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 20:04:02.247544  202120 cri.go:89] found id: ""
	I1202 20:04:02.247570  202120 logs.go:282] 0 containers: []
	W1202 20:04:02.247582  202120 logs.go:284] No container was found matching "kube-proxy"
	I1202 20:04:02.247589  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 20:04:02.247649  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 20:04:02.274800  202120 cri.go:89] found id: "86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992"
	I1202 20:04:02.274823  202120 cri.go:89] found id: ""
	I1202 20:04:02.274832  202120 logs.go:282] 1 containers: [86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992]
	I1202 20:04:02.274890  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:04:02.278665  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 20:04:02.278750  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 20:04:02.304464  202120 cri.go:89] found id: ""
	I1202 20:04:02.304543  202120 logs.go:282] 0 containers: []
	W1202 20:04:02.304576  202120 logs.go:284] No container was found matching "kindnet"
	I1202 20:04:02.304591  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1202 20:04:02.304665  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1202 20:04:02.330293  202120 cri.go:89] found id: ""
	I1202 20:04:02.330318  202120 logs.go:282] 0 containers: []
	W1202 20:04:02.330328  202120 logs.go:284] No container was found matching "storage-provisioner"
	I1202 20:04:02.330342  202120 logs.go:123] Gathering logs for describe nodes ...
	I1202 20:04:02.330353  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 20:04:02.397762  202120 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 20:04:02.397783  202120 logs.go:123] Gathering logs for kube-apiserver [18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92] ...
	I1202 20:04:02.397795  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92"
	I1202 20:04:02.431565  202120 logs.go:123] Gathering logs for kube-controller-manager [86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992] ...
	I1202 20:04:02.431597  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992"
	I1202 20:04:02.466505  202120 logs.go:123] Gathering logs for containerd ...
	I1202 20:04:02.466539  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 20:04:02.499209  202120 logs.go:123] Gathering logs for container status ...
	I1202 20:04:02.499241  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 20:04:02.529104  202120 logs.go:123] Gathering logs for kubelet ...
	I1202 20:04:02.529131  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 20:04:02.589721  202120 logs.go:123] Gathering logs for dmesg ...
	I1202 20:04:02.589755  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 20:04:02.602504  202120 logs.go:123] Gathering logs for etcd [f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53] ...
	I1202 20:04:02.602533  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53"
	I1202 20:04:02.634479  202120 logs.go:123] Gathering logs for kube-scheduler [80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3] ...
	I1202 20:04:02.634507  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3"
	I1202 20:04:05.171873  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:04:05.183010  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 20:04:05.183089  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 20:04:05.209176  202120 cri.go:89] found id: "18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92"
	I1202 20:04:05.209200  202120 cri.go:89] found id: ""
	I1202 20:04:05.209208  202120 logs.go:282] 1 containers: [18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92]
	I1202 20:04:05.209266  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:04:05.212914  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 20:04:05.213026  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 20:04:05.239596  202120 cri.go:89] found id: "f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53"
	I1202 20:04:05.239620  202120 cri.go:89] found id: ""
	I1202 20:04:05.239628  202120 logs.go:282] 1 containers: [f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53]
	I1202 20:04:05.239683  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:04:05.243383  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 20:04:05.243454  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 20:04:05.269483  202120 cri.go:89] found id: ""
	I1202 20:04:05.269518  202120 logs.go:282] 0 containers: []
	W1202 20:04:05.269528  202120 logs.go:284] No container was found matching "coredns"
	I1202 20:04:05.269534  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 20:04:05.269614  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 20:04:05.295333  202120 cri.go:89] found id: "80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3"
	I1202 20:04:05.295352  202120 cri.go:89] found id: ""
	I1202 20:04:05.295359  202120 logs.go:282] 1 containers: [80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3]
	I1202 20:04:05.295415  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:04:05.299216  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 20:04:05.299287  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 20:04:05.326288  202120 cri.go:89] found id: ""
	I1202 20:04:05.326311  202120 logs.go:282] 0 containers: []
	W1202 20:04:05.326320  202120 logs.go:284] No container was found matching "kube-proxy"
	I1202 20:04:05.326326  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 20:04:05.326387  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 20:04:05.352144  202120 cri.go:89] found id: "86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992"
	I1202 20:04:05.352181  202120 cri.go:89] found id: ""
	I1202 20:04:05.352195  202120 logs.go:282] 1 containers: [86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992]
	I1202 20:04:05.352265  202120 ssh_runner.go:195] Run: which crictl
	I1202 20:04:05.356053  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 20:04:05.356126  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 20:04:05.383211  202120 cri.go:89] found id: ""
	I1202 20:04:05.383248  202120 logs.go:282] 0 containers: []
	W1202 20:04:05.383258  202120 logs.go:284] No container was found matching "kindnet"
	I1202 20:04:05.383265  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1202 20:04:05.383350  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1202 20:04:05.419242  202120 cri.go:89] found id: ""
	I1202 20:04:05.419281  202120 logs.go:282] 0 containers: []
	W1202 20:04:05.419291  202120 logs.go:284] No container was found matching "storage-provisioner"
	I1202 20:04:05.419308  202120 logs.go:123] Gathering logs for kube-scheduler [80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3] ...
	I1202 20:04:05.419324  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3"
	I1202 20:04:05.452053  202120 logs.go:123] Gathering logs for containerd ...
	I1202 20:04:05.452084  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 20:04:05.485205  202120 logs.go:123] Gathering logs for kubelet ...
	I1202 20:04:05.485239  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I1202 20:04:05.547497  202120 logs.go:123] Gathering logs for dmesg ...
	I1202 20:04:05.547531  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 20:04:05.560911  202120 logs.go:123] Gathering logs for describe nodes ...
	I1202 20:04:05.560938  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 20:04:05.627018  202120 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 20:04:05.627037  202120 logs.go:123] Gathering logs for kube-controller-manager [86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992] ...
	I1202 20:04:05.627050  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992"
	I1202 20:04:05.675135  202120 logs.go:123] Gathering logs for container status ...
	I1202 20:04:05.675168  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 20:04:05.713735  202120 logs.go:123] Gathering logs for kube-apiserver [18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92] ...
	I1202 20:04:05.713766  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92"
	I1202 20:04:05.771604  202120 logs.go:123] Gathering logs for etcd [f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53] ...
	I1202 20:04:05.771636  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/local/bin/crictl logs --tail 400 f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53"
	I1202 20:04:08.305076  202120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 20:04:08.315338  202120 kubeadm.go:602] duration metric: took 4m3.283507989s to restartPrimaryControlPlane
	W1202 20:04:08.315407  202120 out.go:285] ! Unable to restart control-plane node(s), will reset cluster: <no value>
	I1202 20:04:08.315468  202120 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm reset --cri-socket /run/containerd/containerd.sock --force"
	I1202 20:04:08.806440  202120 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I1202 20:04:08.827341  202120 ssh_runner.go:195] Run: sudo cp /var/tmp/minikube/kubeadm.yaml.new /var/tmp/minikube/kubeadm.yaml
	I1202 20:04:08.838000  202120 kubeadm.go:215] ignoring SystemVerification for kubeadm because of docker driver
	I1202 20:04:08.838071  202120 ssh_runner.go:195] Run: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf
	I1202 20:04:08.850581  202120 kubeadm.go:156] config check failed, skipping stale config cleanup: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	ls: cannot access '/etc/kubernetes/admin.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/kubelet.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/controller-manager.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/scheduler.conf': No such file or directory
	I1202 20:04:08.850602  202120 kubeadm.go:158] found existing configuration files:
	
	I1202 20:04:08.850665  202120 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/admin.conf
	I1202 20:04:08.860306  202120 kubeadm.go:164] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/admin.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/admin.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/admin.conf: No such file or directory
	I1202 20:04:08.860467  202120 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/admin.conf
	I1202 20:04:08.872187  202120 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/kubelet.conf
	I1202 20:04:08.886148  202120 kubeadm.go:164] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/kubelet.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/kubelet.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/kubelet.conf: No such file or directory
	I1202 20:04:08.886216  202120 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/kubelet.conf
	I1202 20:04:08.894927  202120 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/controller-manager.conf
	I1202 20:04:08.904311  202120 kubeadm.go:164] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/controller-manager.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/controller-manager.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/controller-manager.conf: No such file or directory
	I1202 20:04:08.904395  202120 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/controller-manager.conf
	I1202 20:04:08.912864  202120 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/scheduler.conf
	I1202 20:04:08.923173  202120 kubeadm.go:164] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/scheduler.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/scheduler.conf: No such file or directory
	I1202 20:04:08.923249  202120 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/scheduler.conf
	I1202 20:04:08.934820  202120 ssh_runner.go:286] Start: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables"
	I1202 20:04:08.989086  202120 kubeadm.go:319] [init] Using Kubernetes version: v1.35.0-beta.0
	I1202 20:04:08.989461  202120 kubeadm.go:319] [preflight] Running pre-flight checks
	I1202 20:04:09.090446  202120 kubeadm.go:319] [preflight] The system verification failed. Printing the output from the verification:
	I1202 20:04:09.090549  202120 kubeadm.go:319] KERNEL_VERSION: 5.15.0-1084-aws
	I1202 20:04:09.090595  202120 kubeadm.go:319] OS: Linux
	I1202 20:04:09.090654  202120 kubeadm.go:319] CGROUPS_CPU: enabled
	I1202 20:04:09.090713  202120 kubeadm.go:319] CGROUPS_CPUACCT: enabled
	I1202 20:04:09.090775  202120 kubeadm.go:319] CGROUPS_CPUSET: enabled
	I1202 20:04:09.090847  202120 kubeadm.go:319] CGROUPS_DEVICES: enabled
	I1202 20:04:09.090909  202120 kubeadm.go:319] CGROUPS_FREEZER: enabled
	I1202 20:04:09.090966  202120 kubeadm.go:319] CGROUPS_MEMORY: enabled
	I1202 20:04:09.091019  202120 kubeadm.go:319] CGROUPS_PIDS: enabled
	I1202 20:04:09.091074  202120 kubeadm.go:319] CGROUPS_HUGETLB: enabled
	I1202 20:04:09.091128  202120 kubeadm.go:319] CGROUPS_BLKIO: enabled
	I1202 20:04:09.185867  202120 kubeadm.go:319] [preflight] Pulling images required for setting up a Kubernetes cluster
	I1202 20:04:09.185984  202120 kubeadm.go:319] [preflight] This might take a minute or two, depending on the speed of your internet connection
	I1202 20:04:09.186087  202120 kubeadm.go:319] [preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	I1202 20:04:09.192673  202120 kubeadm.go:319] [certs] Using certificateDir folder "/var/lib/minikube/certs"
	I1202 20:04:09.198106  202120 out.go:252]   - Generating certificates and keys ...
	I1202 20:04:09.198207  202120 kubeadm.go:319] [certs] Using existing ca certificate authority
	I1202 20:04:09.198299  202120 kubeadm.go:319] [certs] Using existing apiserver certificate and key on disk
	I1202 20:04:09.198392  202120 kubeadm.go:319] [certs] Using existing apiserver-kubelet-client certificate and key on disk
	I1202 20:04:09.198476  202120 kubeadm.go:319] [certs] Using existing front-proxy-ca certificate authority
	I1202 20:04:09.198567  202120 kubeadm.go:319] [certs] Using existing front-proxy-client certificate and key on disk
	I1202 20:04:09.198635  202120 kubeadm.go:319] [certs] Using existing etcd/ca certificate authority
	I1202 20:04:09.198731  202120 kubeadm.go:319] [certs] Using existing etcd/server certificate and key on disk
	I1202 20:04:09.198834  202120 kubeadm.go:319] [certs] Using existing etcd/peer certificate and key on disk
	I1202 20:04:09.198936  202120 kubeadm.go:319] [certs] Using existing etcd/healthcheck-client certificate and key on disk
	I1202 20:04:09.199029  202120 kubeadm.go:319] [certs] Using existing apiserver-etcd-client certificate and key on disk
	I1202 20:04:09.210620  202120 kubeadm.go:319] [certs] Using the existing "sa" key
	I1202 20:04:09.210689  202120 kubeadm.go:319] [kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	I1202 20:04:09.372653  202120 kubeadm.go:319] [kubeconfig] Writing "admin.conf" kubeconfig file
	I1202 20:04:09.533394  202120 kubeadm.go:319] [kubeconfig] Writing "super-admin.conf" kubeconfig file
	I1202 20:04:09.738749  202120 kubeadm.go:319] [kubeconfig] Writing "kubelet.conf" kubeconfig file
	I1202 20:04:09.824236  202120 kubeadm.go:319] [kubeconfig] Writing "controller-manager.conf" kubeconfig file
	I1202 20:04:09.957496  202120 kubeadm.go:319] [kubeconfig] Writing "scheduler.conf" kubeconfig file
	I1202 20:04:09.964733  202120 kubeadm.go:319] [etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	I1202 20:04:09.971381  202120 kubeadm.go:319] [control-plane] Using manifest folder "/etc/kubernetes/manifests"
	I1202 20:04:09.975242  202120 out.go:252]   - Booting up control plane ...
	I1202 20:04:09.975360  202120 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-apiserver"
	I1202 20:04:09.976740  202120 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-controller-manager"
	I1202 20:04:09.977503  202120 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-scheduler"
	I1202 20:04:10.005237  202120 kubeadm.go:319] [kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	I1202 20:04:10.008911  202120 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	I1202 20:04:10.021613  202120 kubeadm.go:319] [patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	I1202 20:04:10.021710  202120 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	I1202 20:04:10.021748  202120 kubeadm.go:319] [kubelet-start] Starting the kubelet
	I1202 20:04:10.264759  202120 kubeadm.go:319] [wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	I1202 20:04:10.264885  202120 kubeadm.go:319] [kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	I1202 20:08:10.256035  202120 kubeadm.go:319] [kubelet-check] The kubelet is not healthy after 4m0.001050732s
	I1202 20:08:10.256068  202120 kubeadm.go:319] 
	I1202 20:08:10.256125  202120 kubeadm.go:319] Unfortunately, an error has occurred, likely caused by:
	I1202 20:08:10.256160  202120 kubeadm.go:319] 	- The kubelet is not running
	I1202 20:08:10.256264  202120 kubeadm.go:319] 	- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	I1202 20:08:10.256271  202120 kubeadm.go:319] 
	I1202 20:08:10.256392  202120 kubeadm.go:319] If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
	I1202 20:08:10.256425  202120 kubeadm.go:319] 	- 'systemctl status kubelet'
	I1202 20:08:10.256457  202120 kubeadm.go:319] 	- 'journalctl -xeu kubelet'
	I1202 20:08:10.256461  202120 kubeadm.go:319] 
	I1202 20:08:10.260817  202120 kubeadm.go:319] 	[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
	I1202 20:08:10.261227  202120 kubeadm.go:319] 	[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
	I1202 20:08:10.261336  202120 kubeadm.go:319] 	[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	I1202 20:08:10.261560  202120 kubeadm.go:319] error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	I1202 20:08:10.261569  202120 kubeadm.go:319] 
	I1202 20:08:10.261635  202120 kubeadm.go:319] To see the stack trace of this error execute with --v=5 or higher
	W1202 20:08:10.261749  202120 out.go:285] ! initialization failed, will try again: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-beta.0
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.001050732s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	
	To see the stack trace of this error execute with --v=5 or higher
	
	I1202 20:08:10.261834  202120 ssh_runner.go:195] Run: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm reset --cri-socket /run/containerd/containerd.sock --force"
	I1202 20:08:10.671954  202120 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I1202 20:08:10.685558  202120 kubeadm.go:215] ignoring SystemVerification for kubeadm because of docker driver
	I1202 20:08:10.685632  202120 ssh_runner.go:195] Run: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf
	I1202 20:08:10.693979  202120 kubeadm.go:156] config check failed, skipping stale config cleanup: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	ls: cannot access '/etc/kubernetes/admin.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/kubelet.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/controller-manager.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/scheduler.conf': No such file or directory
	I1202 20:08:10.694001  202120 kubeadm.go:158] found existing configuration files:
	
	I1202 20:08:10.694056  202120 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/admin.conf
	I1202 20:08:10.702092  202120 kubeadm.go:164] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/admin.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/admin.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/admin.conf: No such file or directory
	I1202 20:08:10.702154  202120 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/admin.conf
	I1202 20:08:10.710168  202120 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/kubelet.conf
	I1202 20:08:10.718706  202120 kubeadm.go:164] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/kubelet.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/kubelet.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/kubelet.conf: No such file or directory
	I1202 20:08:10.718781  202120 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/kubelet.conf
	I1202 20:08:10.727500  202120 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/controller-manager.conf
	I1202 20:08:10.735954  202120 kubeadm.go:164] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/controller-manager.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/controller-manager.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/controller-manager.conf: No such file or directory
	I1202 20:08:10.736025  202120 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/controller-manager.conf
	I1202 20:08:10.743717  202120 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/scheduler.conf
	I1202 20:08:10.752933  202120 kubeadm.go:164] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/scheduler.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/scheduler.conf: No such file or directory
	I1202 20:08:10.753000  202120 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/scheduler.conf
	I1202 20:08:10.760515  202120 ssh_runner.go:286] Start: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables"
	I1202 20:08:10.798827  202120 kubeadm.go:319] [init] Using Kubernetes version: v1.35.0-beta.0
	I1202 20:08:10.798919  202120 kubeadm.go:319] [preflight] Running pre-flight checks
	I1202 20:08:10.872809  202120 kubeadm.go:319] [preflight] The system verification failed. Printing the output from the verification:
	I1202 20:08:10.872889  202120 kubeadm.go:319] KERNEL_VERSION: 5.15.0-1084-aws
	I1202 20:08:10.872930  202120 kubeadm.go:319] OS: Linux
	I1202 20:08:10.872979  202120 kubeadm.go:319] CGROUPS_CPU: enabled
	I1202 20:08:10.873030  202120 kubeadm.go:319] CGROUPS_CPUACCT: enabled
	I1202 20:08:10.873081  202120 kubeadm.go:319] CGROUPS_CPUSET: enabled
	I1202 20:08:10.873133  202120 kubeadm.go:319] CGROUPS_DEVICES: enabled
	I1202 20:08:10.873184  202120 kubeadm.go:319] CGROUPS_FREEZER: enabled
	I1202 20:08:10.873236  202120 kubeadm.go:319] CGROUPS_MEMORY: enabled
	I1202 20:08:10.873286  202120 kubeadm.go:319] CGROUPS_PIDS: enabled
	I1202 20:08:10.873337  202120 kubeadm.go:319] CGROUPS_HUGETLB: enabled
	I1202 20:08:10.873387  202120 kubeadm.go:319] CGROUPS_BLKIO: enabled
	I1202 20:08:10.939738  202120 kubeadm.go:319] [preflight] Pulling images required for setting up a Kubernetes cluster
	I1202 20:08:10.939854  202120 kubeadm.go:319] [preflight] This might take a minute or two, depending on the speed of your internet connection
	I1202 20:08:10.939981  202120 kubeadm.go:319] [preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	I1202 20:08:10.948747  202120 kubeadm.go:319] [certs] Using certificateDir folder "/var/lib/minikube/certs"
	I1202 20:08:10.952575  202120 out.go:252]   - Generating certificates and keys ...
	I1202 20:08:10.952749  202120 kubeadm.go:319] [certs] Using existing ca certificate authority
	I1202 20:08:10.952849  202120 kubeadm.go:319] [certs] Using existing apiserver certificate and key on disk
	I1202 20:08:10.952962  202120 kubeadm.go:319] [certs] Using existing apiserver-kubelet-client certificate and key on disk
	I1202 20:08:10.953049  202120 kubeadm.go:319] [certs] Using existing front-proxy-ca certificate authority
	I1202 20:08:10.953159  202120 kubeadm.go:319] [certs] Using existing front-proxy-client certificate and key on disk
	I1202 20:08:10.953256  202120 kubeadm.go:319] [certs] Using existing etcd/ca certificate authority
	I1202 20:08:10.953349  202120 kubeadm.go:319] [certs] Using existing etcd/server certificate and key on disk
	I1202 20:08:10.953449  202120 kubeadm.go:319] [certs] Using existing etcd/peer certificate and key on disk
	I1202 20:08:10.953554  202120 kubeadm.go:319] [certs] Using existing etcd/healthcheck-client certificate and key on disk
	I1202 20:08:10.953656  202120 kubeadm.go:319] [certs] Using existing apiserver-etcd-client certificate and key on disk
	I1202 20:08:10.953717  202120 kubeadm.go:319] [certs] Using the existing "sa" key
	I1202 20:08:10.953805  202120 kubeadm.go:319] [kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	I1202 20:08:11.197496  202120 kubeadm.go:319] [kubeconfig] Writing "admin.conf" kubeconfig file
	I1202 20:08:11.362741  202120 kubeadm.go:319] [kubeconfig] Writing "super-admin.conf" kubeconfig file
	I1202 20:08:11.739211  202120 kubeadm.go:319] [kubeconfig] Writing "kubelet.conf" kubeconfig file
	I1202 20:08:11.901743  202120 kubeadm.go:319] [kubeconfig] Writing "controller-manager.conf" kubeconfig file
	I1202 20:08:12.065745  202120 kubeadm.go:319] [kubeconfig] Writing "scheduler.conf" kubeconfig file
	I1202 20:08:12.066310  202120 kubeadm.go:319] [etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	I1202 20:08:12.070709  202120 kubeadm.go:319] [control-plane] Using manifest folder "/etc/kubernetes/manifests"
	I1202 20:08:12.074110  202120 out.go:252]   - Booting up control plane ...
	I1202 20:08:12.074222  202120 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-apiserver"
	I1202 20:08:12.074306  202120 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-controller-manager"
	I1202 20:08:12.074378  202120 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-scheduler"
	I1202 20:08:12.093592  202120 kubeadm.go:319] [kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	I1202 20:08:12.094106  202120 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	I1202 20:08:12.101724  202120 kubeadm.go:319] [patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	I1202 20:08:12.102005  202120 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	I1202 20:08:12.102194  202120 kubeadm.go:319] [kubelet-start] Starting the kubelet
	I1202 20:08:12.226233  202120 kubeadm.go:319] [wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	I1202 20:08:12.226359  202120 kubeadm.go:319] [kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	I1202 20:12:12.226574  202120 kubeadm.go:319] [kubelet-check] The kubelet is not healthy after 4m0.000313622s
	I1202 20:12:12.226609  202120 kubeadm.go:319] 
	I1202 20:12:12.226667  202120 kubeadm.go:319] Unfortunately, an error has occurred, likely caused by:
	I1202 20:12:12.226700  202120 kubeadm.go:319] 	- The kubelet is not running
	I1202 20:12:12.226805  202120 kubeadm.go:319] 	- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	I1202 20:12:12.226811  202120 kubeadm.go:319] 
	I1202 20:12:12.226915  202120 kubeadm.go:319] If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
	I1202 20:12:12.226947  202120 kubeadm.go:319] 	- 'systemctl status kubelet'
	I1202 20:12:12.226977  202120 kubeadm.go:319] 	- 'journalctl -xeu kubelet'
	I1202 20:12:12.226981  202120 kubeadm.go:319] 
	I1202 20:12:12.231714  202120 kubeadm.go:319] 	[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
	I1202 20:12:12.232192  202120 kubeadm.go:319] 	[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
	I1202 20:12:12.232313  202120 kubeadm.go:319] 	[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	I1202 20:12:12.232580  202120 kubeadm.go:319] error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	I1202 20:12:12.232591  202120 kubeadm.go:319] 
	I1202 20:12:12.232661  202120 kubeadm.go:319] To see the stack trace of this error execute with --v=5 or higher
	I1202 20:12:12.232725  202120 kubeadm.go:403] duration metric: took 12m7.280851867s to StartCluster
	I1202 20:12:12.232775  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 20:12:12.232852  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 20:12:12.259777  202120 cri.go:89] found id: ""
	I1202 20:12:12.259803  202120 logs.go:282] 0 containers: []
	W1202 20:12:12.259812  202120 logs.go:284] No container was found matching "kube-apiserver"
	I1202 20:12:12.259820  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 20:12:12.259885  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 20:12:12.286765  202120 cri.go:89] found id: ""
	I1202 20:12:12.286791  202120 logs.go:282] 0 containers: []
	W1202 20:12:12.286799  202120 logs.go:284] No container was found matching "etcd"
	I1202 20:12:12.286806  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 20:12:12.286865  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 20:12:12.310876  202120 cri.go:89] found id: ""
	I1202 20:12:12.310911  202120 logs.go:282] 0 containers: []
	W1202 20:12:12.310919  202120 logs.go:284] No container was found matching "coredns"
	I1202 20:12:12.310926  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 20:12:12.310986  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 20:12:12.342125  202120 cri.go:89] found id: ""
	I1202 20:12:12.342148  202120 logs.go:282] 0 containers: []
	W1202 20:12:12.342157  202120 logs.go:284] No container was found matching "kube-scheduler"
	I1202 20:12:12.342163  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 20:12:12.342222  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 20:12:12.369246  202120 cri.go:89] found id: ""
	I1202 20:12:12.369270  202120 logs.go:282] 0 containers: []
	W1202 20:12:12.369279  202120 logs.go:284] No container was found matching "kube-proxy"
	I1202 20:12:12.369286  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 20:12:12.369346  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 20:12:12.400088  202120 cri.go:89] found id: ""
	I1202 20:12:12.400111  202120 logs.go:282] 0 containers: []
	W1202 20:12:12.400120  202120 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 20:12:12.400126  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 20:12:12.400184  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 20:12:12.431257  202120 cri.go:89] found id: ""
	I1202 20:12:12.431280  202120 logs.go:282] 0 containers: []
	W1202 20:12:12.431289  202120 logs.go:284] No container was found matching "kindnet"
	I1202 20:12:12.431295  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1202 20:12:12.431354  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1202 20:12:12.461161  202120 cri.go:89] found id: ""
	I1202 20:12:12.461183  202120 logs.go:282] 0 containers: []
	W1202 20:12:12.461191  202120 logs.go:284] No container was found matching "storage-provisioner"
	I1202 20:12:12.461201  202120 logs.go:123] Gathering logs for dmesg ...
	I1202 20:12:12.461213  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 20:12:12.476068  202120 logs.go:123] Gathering logs for describe nodes ...
	I1202 20:12:12.476091  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 20:12:12.547327  202120 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 20:12:12.547349  202120 logs.go:123] Gathering logs for containerd ...
	I1202 20:12:12.547361  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 20:12:12.586389  202120 logs.go:123] Gathering logs for container status ...
	I1202 20:12:12.586422  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 20:12:12.618762  202120 logs.go:123] Gathering logs for kubelet ...
	I1202 20:12:12.618789  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	W1202 20:12:12.679897  202120 out.go:434] Error starting cluster: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-beta.0
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.000313622s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	
	To see the stack trace of this error execute with --v=5 or higher
	W1202 20:12:12.679958  202120 out.go:285] * 
	W1202 20:12:12.680015  202120 out.go:285] X Error starting cluster: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-beta.0
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.000313622s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	
	To see the stack trace of this error execute with --v=5 or higher
	
	W1202 20:12:12.680033  202120 out.go:285] * 
	W1202 20:12:12.682162  202120 out.go:308] ╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	I1202 20:12:12.687985  202120 out.go:203] 
	W1202 20:12:12.691983  202120 out.go:285] X Exiting due to K8S_KUBELET_NOT_RUNNING: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-beta.0
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.000313622s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	
	To see the stack trace of this error execute with --v=5 or higher
	
	X Exiting due to K8S_KUBELET_NOT_RUNNING: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-beta.0
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.000313622s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	
	To see the stack trace of this error execute with --v=5 or higher
	
	W1202 20:12:12.692054  202120 out.go:285] * Suggestion: Check output of 'journalctl -xeu kubelet', try passing --extra-config=kubelet.cgroup-driver=systemd to minikube start
	* Suggestion: Check output of 'journalctl -xeu kubelet', try passing --extra-config=kubelet.cgroup-driver=systemd to minikube start
	W1202 20:12:12.692078  202120 out.go:285] * Related issue: https://github.com/kubernetes/minikube/issues/4172
	* Related issue: https://github.com/kubernetes/minikube/issues/4172
	I1202 20:12:12.695545  202120 out.go:203] 

** /stderr **
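The suggestion minikube prints above (check `journalctl -xeu kubelet`, try the systemd cgroup driver) can be followed up manually. A hedged sketch, assuming the profile name from this run, the Docker driver, and a Linux host; the `--extra-config` flag is the one minikube itself suggests, and `stat -fc %T` distinguishing cgroup versions is a general Linux convention, not something this log confirms:

```shell
# Sketch only: profile name taken from this failed run; with the Docker
# driver the node is a container reachable via `minikube ssh`.
PROFILE=kubernetes-upgrade-685093

# Inspect kubelet state inside the node, as the error message suggests:
minikube -p "$PROFILE" ssh -- sudo systemctl status kubelet --no-pager
minikube -p "$PROFILE" ssh -- sudo journalctl -xeu kubelet --no-pager | tail -n 50

# Check which cgroup filesystem the node sees (cgroup2fs => v2, tmpfs => v1);
# the SystemVerification warning above fires on cgroup v1 hosts:
minikube -p "$PROFILE" ssh -- stat -fc %T /sys/fs/cgroup

# Retry the start with the cgroup driver the suggestion proposes:
minikube start -p "$PROFILE" --extra-config=kubelet.cgroup-driver=systemd \
  --driver=docker --container-runtime=containerd
```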
version_upgrade_test.go:245: failed to upgrade with newest k8s version. args: out/minikube-linux-arm64 start -p kubernetes-upgrade-685093 --memory=3072 --kubernetes-version=v1.35.0-beta.0 --alsologtostderr -v=1 --driver=docker  --container-runtime=containerd : exit status 109
version_upgrade_test.go:248: (dbg) Run:  kubectl --context kubernetes-upgrade-685093 version --output=json
version_upgrade_test.go:248: (dbg) Non-zero exit: kubectl --context kubernetes-upgrade-685093 version --output=json: exit status 1 (85.744857ms)

-- stdout --
	{
	  "clientVersion": {
	    "major": "1",
	    "minor": "33",
	    "gitVersion": "v1.33.2",
	    "gitCommit": "a57b6f7709f6c2722b92f07b8b4c48210a51fc40",
	    "gitTreeState": "clean",
	    "buildDate": "2025-06-17T18:41:31Z",
	    "goVersion": "go1.24.4",
	    "compiler": "gc",
	    "platform": "linux/arm64"
	  },
	  "kustomizeVersion": "v5.6.0"
	}

-- /stdout --
** stderr ** 
	The connection to the server 192.168.76.2:8443 was refused - did you specify the right host or port?

** /stderr **
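The refused connection can be probed directly from the host; a sketch, assuming the endpoint `192.168.76.2:8443` quoted in the error above (a connection refused here, as opposed to a TLS or HTTP error, means nothing is listening, consistent with the kubelet never bringing up the kube-apiserver static pod):

```shell
# Sketch: IP and port taken from the error above; -k because the
# apiserver presents a cluster-internal CA the host does not trust.
curl -k --max-time 5 https://192.168.76.2:8443/healthz ; echo
```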
version_upgrade_test.go:250: error running kubectl: exit status 1
panic.go:615: *** TestKubernetesUpgrade FAILED at 2025-12-02 20:12:13.468039667 +0000 UTC m=+5065.641432220
helpers_test.go:222: -----------------------post-mortem--------------------------------
helpers_test.go:223: ======>  post-mortem[TestKubernetesUpgrade]: network settings <======
helpers_test.go:230: HOST ENV snapshots: PROXY env: HTTP_PROXY="<empty>" HTTPS_PROXY="<empty>" NO_PROXY="<empty>"
helpers_test.go:238: ======>  post-mortem[TestKubernetesUpgrade]: docker inspect <======
helpers_test.go:239: (dbg) Run:  docker inspect kubernetes-upgrade-685093
helpers_test.go:243: (dbg) docker inspect kubernetes-upgrade-685093:

-- stdout --
	[
	    {
	        "Id": "62852c3ac88042833a3988618f01fcc0f5aca80028478cc3d470a570b36a4c19",
	        "Created": "2025-12-02T19:59:11.827526396Z",
	        "Path": "/usr/local/bin/entrypoint",
	        "Args": [
	            "/sbin/init"
	        ],
	        "State": {
	            "Status": "running",
	            "Running": true,
	            "Paused": false,
	            "Restarting": false,
	            "OOMKilled": false,
	            "Dead": false,
	            "Pid": 202255,
	            "ExitCode": 0,
	            "Error": "",
	            "StartedAt": "2025-12-02T19:59:45.982321396Z",
	            "FinishedAt": "2025-12-02T19:59:44.826841001Z"
	        },
	        "Image": "sha256:ac919894123858c63a6b115b7a0677e38aafc32ba4f00c3ebbd7c61e958451be",
	        "ResolvConfPath": "/var/lib/docker/containers/62852c3ac88042833a3988618f01fcc0f5aca80028478cc3d470a570b36a4c19/resolv.conf",
	        "HostnamePath": "/var/lib/docker/containers/62852c3ac88042833a3988618f01fcc0f5aca80028478cc3d470a570b36a4c19/hostname",
	        "HostsPath": "/var/lib/docker/containers/62852c3ac88042833a3988618f01fcc0f5aca80028478cc3d470a570b36a4c19/hosts",
	        "LogPath": "/var/lib/docker/containers/62852c3ac88042833a3988618f01fcc0f5aca80028478cc3d470a570b36a4c19/62852c3ac88042833a3988618f01fcc0f5aca80028478cc3d470a570b36a4c19-json.log",
	        "Name": "/kubernetes-upgrade-685093",
	        "RestartCount": 0,
	        "Driver": "overlay2",
	        "Platform": "linux",
	        "MountLabel": "",
	        "ProcessLabel": "",
	        "AppArmorProfile": "unconfined",
	        "ExecIDs": null,
	        "HostConfig": {
	            "Binds": [
	                "/lib/modules:/lib/modules:ro",
	                "kubernetes-upgrade-685093:/var"
	            ],
	            "ContainerIDFile": "",
	            "LogConfig": {
	                "Type": "json-file",
	                "Config": {}
	            },
	            "NetworkMode": "kubernetes-upgrade-685093",
	            "PortBindings": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ],
	                "8443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": ""
	                    }
	                ]
	            },
	            "RestartPolicy": {
	                "Name": "no",
	                "MaximumRetryCount": 0
	            },
	            "AutoRemove": false,
	            "VolumeDriver": "",
	            "VolumesFrom": null,
	            "ConsoleSize": [
	                0,
	                0
	            ],
	            "CapAdd": null,
	            "CapDrop": null,
	            "CgroupnsMode": "host",
	            "Dns": [],
	            "DnsOptions": [],
	            "DnsSearch": [],
	            "ExtraHosts": null,
	            "GroupAdd": null,
	            "IpcMode": "private",
	            "Cgroup": "",
	            "Links": null,
	            "OomScoreAdj": 0,
	            "PidMode": "",
	            "Privileged": true,
	            "PublishAllPorts": false,
	            "ReadonlyRootfs": false,
	            "SecurityOpt": [
	                "seccomp=unconfined",
	                "apparmor=unconfined",
	                "label=disable"
	            ],
	            "Tmpfs": {
	                "/run": "",
	                "/tmp": ""
	            },
	            "UTSMode": "",
	            "UsernsMode": "",
	            "ShmSize": 67108864,
	            "Runtime": "runc",
	            "Isolation": "",
	            "CpuShares": 0,
	            "Memory": 3221225472,
	            "NanoCpus": 2000000000,
	            "CgroupParent": "",
	            "BlkioWeight": 0,
	            "BlkioWeightDevice": [],
	            "BlkioDeviceReadBps": [],
	            "BlkioDeviceWriteBps": [],
	            "BlkioDeviceReadIOps": [],
	            "BlkioDeviceWriteIOps": [],
	            "CpuPeriod": 0,
	            "CpuQuota": 0,
	            "CpuRealtimePeriod": 0,
	            "CpuRealtimeRuntime": 0,
	            "CpusetCpus": "",
	            "CpusetMems": "",
	            "Devices": [],
	            "DeviceCgroupRules": null,
	            "DeviceRequests": null,
	            "MemoryReservation": 0,
	            "MemorySwap": 6442450944,
	            "MemorySwappiness": null,
	            "OomKillDisable": false,
	            "PidsLimit": null,
	            "Ulimits": [],
	            "CpuCount": 0,
	            "CpuPercent": 0,
	            "IOMaximumIOps": 0,
	            "IOMaximumBandwidth": 0,
	            "MaskedPaths": null,
	            "ReadonlyPaths": null
	        },
	        "GraphDriver": {
	            "Data": {
	                "ID": "62852c3ac88042833a3988618f01fcc0f5aca80028478cc3d470a570b36a4c19",
	                "LowerDir": "/var/lib/docker/overlay2/41299d59430bae9d4f379a372e37ca6a00f7e60e33d47b76a79ef69e3585e3fc-init/diff:/var/lib/docker/overlay2/a59c61675ee48e07a7f4a8725bd393449453344ad8907963779ea1c0059d936c/diff",
	                "MergedDir": "/var/lib/docker/overlay2/41299d59430bae9d4f379a372e37ca6a00f7e60e33d47b76a79ef69e3585e3fc/merged",
	                "UpperDir": "/var/lib/docker/overlay2/41299d59430bae9d4f379a372e37ca6a00f7e60e33d47b76a79ef69e3585e3fc/diff",
	                "WorkDir": "/var/lib/docker/overlay2/41299d59430bae9d4f379a372e37ca6a00f7e60e33d47b76a79ef69e3585e3fc/work"
	            },
	            "Name": "overlay2"
	        },
	        "Mounts": [
	            {
	                "Type": "volume",
	                "Name": "kubernetes-upgrade-685093",
	                "Source": "/var/lib/docker/volumes/kubernetes-upgrade-685093/_data",
	                "Destination": "/var",
	                "Driver": "local",
	                "Mode": "z",
	                "RW": true,
	                "Propagation": ""
	            },
	            {
	                "Type": "bind",
	                "Source": "/lib/modules",
	                "Destination": "/lib/modules",
	                "Mode": "ro",
	                "RW": false,
	                "Propagation": "rprivate"
	            }
	        ],
	        "Config": {
	            "Hostname": "kubernetes-upgrade-685093",
	            "Domainname": "",
	            "User": "",
	            "AttachStdin": false,
	            "AttachStdout": false,
	            "AttachStderr": false,
	            "ExposedPorts": {
	                "22/tcp": {},
	                "2376/tcp": {},
	                "32443/tcp": {},
	                "5000/tcp": {},
	                "8443/tcp": {}
	            },
	            "Tty": true,
	            "OpenStdin": false,
	            "StdinOnce": false,
	            "Env": [
	                "container=docker",
	                "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin"
	            ],
	            "Cmd": null,
	            "Image": "gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b",
	            "Volumes": null,
	            "WorkingDir": "/",
	            "Entrypoint": [
	                "/usr/local/bin/entrypoint",
	                "/sbin/init"
	            ],
	            "OnBuild": null,
	            "Labels": {
	                "created_by.minikube.sigs.k8s.io": "true",
	                "mode.minikube.sigs.k8s.io": "kubernetes-upgrade-685093",
	                "name.minikube.sigs.k8s.io": "kubernetes-upgrade-685093",
	                "role.minikube.sigs.k8s.io": ""
	            },
	            "StopSignal": "SIGRTMIN+3"
	        },
	        "NetworkSettings": {
	            "Bridge": "",
	            "SandboxID": "da7a58a30696f770e03b51e87be7f3fd3ccecce4ae65fb6b83c4a1c4af4e83cb",
	            "SandboxKey": "/var/run/docker/netns/da7a58a30696",
	            "Ports": {
	                "22/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "33015"
	                    }
	                ],
	                "2376/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "33016"
	                    }
	                ],
	                "32443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "33019"
	                    }
	                ],
	                "5000/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "33017"
	                    }
	                ],
	                "8443/tcp": [
	                    {
	                        "HostIp": "127.0.0.1",
	                        "HostPort": "33018"
	                    }
	                ]
	            },
	            "HairpinMode": false,
	            "LinkLocalIPv6Address": "",
	            "LinkLocalIPv6PrefixLen": 0,
	            "SecondaryIPAddresses": null,
	            "SecondaryIPv6Addresses": null,
	            "EndpointID": "",
	            "Gateway": "",
	            "GlobalIPv6Address": "",
	            "GlobalIPv6PrefixLen": 0,
	            "IPAddress": "",
	            "IPPrefixLen": 0,
	            "IPv6Gateway": "",
	            "MacAddress": "",
	            "Networks": {
	                "kubernetes-upgrade-685093": {
	                    "IPAMConfig": {
	                        "IPv4Address": "192.168.76.2"
	                    },
	                    "Links": null,
	                    "Aliases": null,
	                    "MacAddress": "02:7f:a7:f6:90:c6",
	                    "DriverOpts": null,
	                    "GwPriority": 0,
	                    "NetworkID": "fa1a096148868647f6b506a8c00232ed455b3bc0c17469056bb3db76a0a3bf42",
	                    "EndpointID": "40b072ecdf625e2fb49289d5df37aa792765c6e3f5329a66aa21f9a1496de34a",
	                    "Gateway": "192.168.76.1",
	                    "IPAddress": "192.168.76.2",
	                    "IPPrefixLen": 24,
	                    "IPv6Gateway": "",
	                    "GlobalIPv6Address": "",
	                    "GlobalIPv6PrefixLen": 0,
	                    "DNSNames": [
	                        "kubernetes-upgrade-685093",
	                        "62852c3ac880"
	                    ]
	                }
	            }
	        }
	    }
	]

-- /stdout --
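When only a few of the fields above matter, `docker inspect` can render them directly with its standard `--format` Go-template flag instead of dumping the full JSON; a sketch against the container and network names from this run:

```shell
# Sketch: container and network names taken from this run. The template
# pulls the same fields visible in the JSON dump above.
docker inspect kubernetes-upgrade-685093 --format \
  'status={{.State.Status}} restarts={{.RestartCount}} ip={{(index .NetworkSettings.Networks "kubernetes-upgrade-685093").IPAddress}}'
```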
helpers_test.go:247: (dbg) Run:  out/minikube-linux-arm64 status --format={{.Host}} -p kubernetes-upgrade-685093 -n kubernetes-upgrade-685093
helpers_test.go:247: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.Host}} -p kubernetes-upgrade-685093 -n kubernetes-upgrade-685093: exit status 2 (325.094293ms)

-- stdout --
	Running

-- /stdout --
helpers_test.go:247: status error: exit status 2 (may be ok)
helpers_test.go:252: <<< TestKubernetesUpgrade FAILED: start of post-mortem logs <<<
helpers_test.go:253: ======>  post-mortem[TestKubernetesUpgrade]: minikube logs <======
helpers_test.go:255: (dbg) Run:  out/minikube-linux-arm64 -p kubernetes-upgrade-685093 logs -n 25
helpers_test.go:260: TestKubernetesUpgrade logs: 
-- stdout --
	
	==> Audit <==
	┌─────────┬───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┬─────────────────────────────┬─────────┬─────────┬─────────────────────┬─────────────────────┐
	│ COMMAND │                                                                         ARGS                                                                          │           PROFILE           │  USER   │ VERSION │     START TIME      │      END TIME       │
	├─────────┼───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┼─────────────────────────────┼─────────┼─────────┼─────────────────────┼─────────────────────┤
	│ delete  │ -p insufficient-storage-282691                                                                                                                        │ insufficient-storage-282691 │ jenkins │ v1.37.0 │ 02 Dec 25 19:57 UTC │ 02 Dec 25 19:57 UTC │
	│ start   │ -p NoKubernetes-884696 --no-kubernetes --kubernetes-version=v1.28.0 --driver=docker  --container-runtime=containerd                                   │ NoKubernetes-884696         │ jenkins │ v1.37.0 │ 02 Dec 25 19:57 UTC │                     │
	│ start   │ -p NoKubernetes-884696 --memory=3072 --alsologtostderr -v=5 --driver=docker  --container-runtime=containerd                                           │ NoKubernetes-884696         │ jenkins │ v1.37.0 │ 02 Dec 25 19:57 UTC │ 02 Dec 25 19:58 UTC │
	│ start   │ -p missing-upgrade-824445 --memory=3072 --driver=docker  --container-runtime=containerd                                                               │ missing-upgrade-824445      │ jenkins │ v1.35.0 │ 02 Dec 25 19:57 UTC │ 02 Dec 25 19:58 UTC │
	│ start   │ -p NoKubernetes-884696 --no-kubernetes --memory=3072 --alsologtostderr -v=5 --driver=docker  --container-runtime=containerd                           │ NoKubernetes-884696         │ jenkins │ v1.37.0 │ 02 Dec 25 19:58 UTC │ 02 Dec 25 19:58 UTC │
	│ delete  │ -p NoKubernetes-884696                                                                                                                                │ NoKubernetes-884696         │ jenkins │ v1.37.0 │ 02 Dec 25 19:58 UTC │ 02 Dec 25 19:58 UTC │
	│ start   │ -p NoKubernetes-884696 --no-kubernetes --memory=3072 --alsologtostderr -v=5 --driver=docker  --container-runtime=containerd                           │ NoKubernetes-884696         │ jenkins │ v1.37.0 │ 02 Dec 25 19:58 UTC │ 02 Dec 25 19:58 UTC │
	│ ssh     │ -p NoKubernetes-884696 sudo systemctl is-active --quiet service kubelet                                                                               │ NoKubernetes-884696         │ jenkins │ v1.37.0 │ 02 Dec 25 19:58 UTC │                     │
	│ start   │ -p missing-upgrade-824445 --memory=3072 --alsologtostderr -v=1 --driver=docker  --container-runtime=containerd                                        │ missing-upgrade-824445      │ jenkins │ v1.37.0 │ 02 Dec 25 19:58 UTC │ 02 Dec 25 20:00 UTC │
	│ stop    │ -p NoKubernetes-884696                                                                                                                                │ NoKubernetes-884696         │ jenkins │ v1.37.0 │ 02 Dec 25 19:58 UTC │ 02 Dec 25 19:58 UTC │
	│ start   │ -p NoKubernetes-884696 --driver=docker  --container-runtime=containerd                                                                                │ NoKubernetes-884696         │ jenkins │ v1.37.0 │ 02 Dec 25 19:58 UTC │ 02 Dec 25 19:59 UTC │
	│ ssh     │ -p NoKubernetes-884696 sudo systemctl is-active --quiet service kubelet                                                                               │ NoKubernetes-884696         │ jenkins │ v1.37.0 │ 02 Dec 25 19:59 UTC │                     │
	│ delete  │ -p NoKubernetes-884696                                                                                                                                │ NoKubernetes-884696         │ jenkins │ v1.37.0 │ 02 Dec 25 19:59 UTC │ 02 Dec 25 19:59 UTC │
	│ start   │ -p kubernetes-upgrade-685093 --memory=3072 --kubernetes-version=v1.28.0 --alsologtostderr -v=1 --driver=docker  --container-runtime=containerd        │ kubernetes-upgrade-685093   │ jenkins │ v1.37.0 │ 02 Dec 25 19:59 UTC │ 02 Dec 25 19:59 UTC │
	│ stop    │ -p kubernetes-upgrade-685093                                                                                                                          │ kubernetes-upgrade-685093   │ jenkins │ v1.37.0 │ 02 Dec 25 19:59 UTC │ 02 Dec 25 19:59 UTC │
	│ start   │ -p kubernetes-upgrade-685093 --memory=3072 --kubernetes-version=v1.35.0-beta.0 --alsologtostderr -v=1 --driver=docker  --container-runtime=containerd │ kubernetes-upgrade-685093   │ jenkins │ v1.37.0 │ 02 Dec 25 19:59 UTC │                     │
	│ delete  │ -p missing-upgrade-824445                                                                                                                             │ missing-upgrade-824445      │ jenkins │ v1.37.0 │ 02 Dec 25 20:00 UTC │ 02 Dec 25 20:00 UTC │
	│ start   │ -p stopped-upgrade-629737 --memory=3072 --vm-driver=docker  --container-runtime=containerd                                                            │ stopped-upgrade-629737      │ jenkins │ v1.35.0 │ 02 Dec 25 20:00 UTC │ 02 Dec 25 20:01 UTC │
	│ stop    │ stopped-upgrade-629737 stop                                                                                                                           │ stopped-upgrade-629737      │ jenkins │ v1.35.0 │ 02 Dec 25 20:01 UTC │ 02 Dec 25 20:01 UTC │
	│ start   │ -p stopped-upgrade-629737 --memory=3072 --alsologtostderr -v=1 --driver=docker  --container-runtime=containerd                                        │ stopped-upgrade-629737      │ jenkins │ v1.37.0 │ 02 Dec 25 20:01 UTC │ 02 Dec 25 20:05 UTC │
	│ delete  │ -p stopped-upgrade-629737                                                                                                                             │ stopped-upgrade-629737      │ jenkins │ v1.37.0 │ 02 Dec 25 20:05 UTC │ 02 Dec 25 20:05 UTC │
	│ start   │ -p running-upgrade-516082 --memory=3072 --vm-driver=docker  --container-runtime=containerd                                                            │ running-upgrade-516082      │ jenkins │ v1.35.0 │ 02 Dec 25 20:06 UTC │ 02 Dec 25 20:06 UTC │
	│ start   │ -p running-upgrade-516082 --memory=3072 --alsologtostderr -v=1 --driver=docker  --container-runtime=containerd                                        │ running-upgrade-516082      │ jenkins │ v1.37.0 │ 02 Dec 25 20:06 UTC │ 02 Dec 25 20:11 UTC │
	│ delete  │ -p running-upgrade-516082                                                                                                                             │ running-upgrade-516082      │ jenkins │ v1.37.0 │ 02 Dec 25 20:11 UTC │ 02 Dec 25 20:11 UTC │
	│ start   │ -p pause-362069 --memory=3072 --install-addons=false --wait=all --driver=docker  --container-runtime=containerd                                       │ pause-362069                │ jenkins │ v1.37.0 │ 02 Dec 25 20:11 UTC │                     │
	└─────────┴───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┴─────────────────────────────┴─────────┴─────────┴─────────────────────┴─────────────────────┘
	
	
	==> Last Start <==
	Log file created at: 2025/12/02 20:11:13
	Running on machine: ip-172-31-31-251
	Binary: Built with gc go1.25.3 for linux/arm64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I1202 20:11:13.746949  239151 out.go:360] Setting OutFile to fd 1 ...
	I1202 20:11:13.747054  239151 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1202 20:11:13.747059  239151 out.go:374] Setting ErrFile to fd 2...
	I1202 20:11:13.747062  239151 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1202 20:11:13.747433  239151 root.go:338] Updating PATH: /home/jenkins/minikube-integration/22021-2487/.minikube/bin
	I1202 20:11:13.747913  239151 out.go:368] Setting JSON to false
	I1202 20:11:13.748847  239151 start.go:133] hostinfo: {"hostname":"ip-172-31-31-251","uptime":6810,"bootTime":1764699464,"procs":176,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1084-aws","kernelArch":"aarch64","virtualizationSystem":"","virtualizationRole":"","hostId":"982e3628-3742-4b3e-bb63-ac1b07660ec7"}
	I1202 20:11:13.748937  239151 start.go:143] virtualization:  
	I1202 20:11:13.752631  239151 out.go:179] * [pause-362069] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	I1202 20:11:13.755449  239151 out.go:179]   - MINIKUBE_LOCATION=22021
	I1202 20:11:13.755522  239151 notify.go:221] Checking for updates...
	I1202 20:11:13.762423  239151 out.go:179]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I1202 20:11:13.765724  239151 out.go:179]   - KUBECONFIG=/home/jenkins/minikube-integration/22021-2487/kubeconfig
	I1202 20:11:13.768874  239151 out.go:179]   - MINIKUBE_HOME=/home/jenkins/minikube-integration/22021-2487/.minikube
	I1202 20:11:13.772068  239151 out.go:179]   - MINIKUBE_BIN=out/minikube-linux-arm64
	I1202 20:11:13.775120  239151 out.go:179]   - MINIKUBE_FORCE_SYSTEMD=
	I1202 20:11:13.778564  239151 config.go:182] Loaded profile config "kubernetes-upgrade-685093": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-beta.0
	I1202 20:11:13.778670  239151 driver.go:422] Setting default libvirt URI to qemu:///system
	I1202 20:11:13.809494  239151 docker.go:124] docker version: linux-28.1.1:Docker Engine - Community
	I1202 20:11:13.809608  239151 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1202 20:11:13.867468  239151 info.go:266] docker info: {ID:EOU5:DNGX:XN6V:L2FZ:UXRM:5TWK:EVUR:KC2F:GT7Z:Y4O4:GB77:5PD3 Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:4 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:37 OomKillDisable:true NGoroutines:52 SystemTime:2025-12-02 20:11:13.857866787 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214839296 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-31-251 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1202 20:11:13.867558  239151 docker.go:319] overlay module found
	I1202 20:11:13.870740  239151 out.go:179] * Using the docker driver based on user configuration
	I1202 20:11:13.873588  239151 start.go:309] selected driver: docker
	I1202 20:11:13.873595  239151 start.go:927] validating driver "docker" against <nil>
	I1202 20:11:13.873605  239151 start.go:938] status for docker: {Installed:true Healthy:true Running:false NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I1202 20:11:13.874330  239151 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1202 20:11:13.943878  239151 info.go:266] docker info: {ID:EOU5:DNGX:XN6V:L2FZ:UXRM:5TWK:EVUR:KC2F:GT7Z:Y4O4:GB77:5PD3 Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:4 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:37 OomKillDisable:true NGoroutines:52 SystemTime:2025-12-02 20:11:13.934106491 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214839296 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-31-251 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1202 20:11:13.944018  239151 start_flags.go:327] no existing cluster config was found, will generate one from the flags 
	I1202 20:11:13.944231  239151 start_flags.go:992] Waiting for all components: map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true]
	I1202 20:11:13.947322  239151 out.go:179] * Using Docker driver with root privileges
	I1202 20:11:13.950146  239151 cni.go:84] Creating CNI manager for ""
	I1202 20:11:13.950204  239151 cni.go:143] "docker" driver + "containerd" runtime found, recommending kindnet
	I1202 20:11:13.950214  239151 start_flags.go:336] Found "CNI" CNI - setting NetworkPlugin=cni
	I1202 20:11:13.950289  239151 start.go:353] cluster config:
	{Name:pause-362069 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b Memory:3072 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.34.2 ClusterName:pause-362069 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP: Port:8443 KubernetesVersion:v1.34.2 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1202 20:11:13.955257  239151 out.go:179] * Starting "pause-362069" primary control-plane node in "pause-362069" cluster
	I1202 20:11:13.958154  239151 cache.go:134] Beginning downloading kic base image for docker with containerd
	I1202 20:11:13.961082  239151 out.go:179] * Pulling base image v0.0.48-1764169655-21974 ...
	I1202 20:11:13.963932  239151 preload.go:188] Checking if preload exists for k8s version v1.34.2 and runtime containerd
	I1202 20:11:13.963973  239151 preload.go:203] Found local preload: /home/jenkins/minikube-integration/22021-2487/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.34.2-containerd-overlay2-arm64.tar.lz4
	I1202 20:11:13.963983  239151 cache.go:65] Caching tarball of preloaded images
	I1202 20:11:13.964072  239151 preload.go:238] Found /home/jenkins/minikube-integration/22021-2487/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.34.2-containerd-overlay2-arm64.tar.lz4 in cache, skipping download
	I1202 20:11:13.964081  239151 cache.go:68] Finished verifying existence of preloaded tar for v1.34.2 on containerd
	I1202 20:11:13.964207  239151 profile.go:143] Saving config to /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/pause-362069/config.json ...
	I1202 20:11:13.964224  239151 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/pause-362069/config.json: {Name:mkff3bbaf5278d14aea4f24e46ba2582616c596f Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1202 20:11:13.964404  239151 image.go:81] Checking for gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b in local docker daemon
	I1202 20:11:13.992535  239151 image.go:100] Found gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b in local docker daemon, skipping pull
	I1202 20:11:13.992546  239151 cache.go:158] gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b exists in daemon, skipping load
	I1202 20:11:13.992570  239151 cache.go:243] Successfully downloaded all kic artifacts
	I1202 20:11:13.992598  239151 start.go:360] acquireMachinesLock for pause-362069: {Name:mkd2ccb3af992be021c54c80d06b60441773d1f6 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I1202 20:11:13.992717  239151 start.go:364] duration metric: took 106.053µs to acquireMachinesLock for "pause-362069"
	I1202 20:11:13.992743  239151 start.go:93] Provisioning new machine with config: &{Name:pause-362069 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b Memory:3072 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.34.2 ClusterName:pause-362069 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP: Port:8443 KubernetesVersion:v1.34.2 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} &{Name: IP: Port:8443 KubernetesVersion:v1.34.2 ContainerRuntime:containerd ControlPlane:true Worker:true}
	I1202 20:11:13.992813  239151 start.go:125] createHost starting for "" (driver="docker")
	I1202 20:11:13.998140  239151 out.go:252] * Creating docker container (CPUs=2, Memory=3072MB) ...
	I1202 20:11:13.998419  239151 start.go:159] libmachine.API.Create for "pause-362069" (driver="docker")
	I1202 20:11:13.998451  239151 client.go:173] LocalClient.Create starting
	I1202 20:11:13.998533  239151 main.go:143] libmachine: Reading certificate data from /home/jenkins/minikube-integration/22021-2487/.minikube/certs/ca.pem
	I1202 20:11:13.998571  239151 main.go:143] libmachine: Decoding PEM data...
	I1202 20:11:13.998589  239151 main.go:143] libmachine: Parsing certificate...
	I1202 20:11:13.998653  239151 main.go:143] libmachine: Reading certificate data from /home/jenkins/minikube-integration/22021-2487/.minikube/certs/cert.pem
	I1202 20:11:13.998675  239151 main.go:143] libmachine: Decoding PEM data...
	I1202 20:11:13.998685  239151 main.go:143] libmachine: Parsing certificate...
	I1202 20:11:13.999051  239151 cli_runner.go:164] Run: docker network inspect pause-362069 --format "{"Name": "{{.Name}}","Driver": "{{.Driver}}","Subnet": "{{range .IPAM.Config}}{{.Subnet}}{{end}}","Gateway": "{{range .IPAM.Config}}{{.Gateway}}{{end}}","MTU": {{if (index .Options "com.docker.network.driver.mtu")}}{{(index .Options "com.docker.network.driver.mtu")}}{{else}}0{{end}}, "ContainerIPs": [{{range $k,$v := .Containers }}"{{$v.IPv4Address}}",{{end}}]}"
	W1202 20:11:14.019407  239151 cli_runner.go:211] docker network inspect pause-362069 --format "{"Name": "{{.Name}}","Driver": "{{.Driver}}","Subnet": "{{range .IPAM.Config}}{{.Subnet}}{{end}}","Gateway": "{{range .IPAM.Config}}{{.Gateway}}{{end}}","MTU": {{if (index .Options "com.docker.network.driver.mtu")}}{{(index .Options "com.docker.network.driver.mtu")}}{{else}}0{{end}}, "ContainerIPs": [{{range $k,$v := .Containers }}"{{$v.IPv4Address}}",{{end}}]}" returned with exit code 1
	I1202 20:11:14.019479  239151 network_create.go:284] running [docker network inspect pause-362069] to gather additional debugging logs...
	I1202 20:11:14.019495  239151 cli_runner.go:164] Run: docker network inspect pause-362069
	W1202 20:11:14.036467  239151 cli_runner.go:211] docker network inspect pause-362069 returned with exit code 1
	I1202 20:11:14.036487  239151 network_create.go:287] error running [docker network inspect pause-362069]: docker network inspect pause-362069: exit status 1
	stdout:
	[]
	
	stderr:
	Error response from daemon: network pause-362069 not found
	I1202 20:11:14.036500  239151 network_create.go:289] output of [docker network inspect pause-362069]: -- stdout --
	[]
	
	-- /stdout --
	** stderr ** 
	Error response from daemon: network pause-362069 not found
	
	** /stderr **
	I1202 20:11:14.036618  239151 cli_runner.go:164] Run: docker network inspect bridge --format "{"Name": "{{.Name}}","Driver": "{{.Driver}}","Subnet": "{{range .IPAM.Config}}{{.Subnet}}{{end}}","Gateway": "{{range .IPAM.Config}}{{.Gateway}}{{end}}","MTU": {{if (index .Options "com.docker.network.driver.mtu")}}{{(index .Options "com.docker.network.driver.mtu")}}{{else}}0{{end}}, "ContainerIPs": [{{range $k,$v := .Containers }}"{{$v.IPv4Address}}",{{end}}]}"
	I1202 20:11:14.054279  239151 network.go:211] skipping subnet 192.168.49.0/24 that is taken: &{IP:192.168.49.0 Netmask:255.255.255.0 Prefix:24 CIDR:192.168.49.0/24 Gateway:192.168.49.1 ClientMin:192.168.49.2 ClientMax:192.168.49.254 Broadcast:192.168.49.255 IsPrivate:true Interface:{IfaceName:br-af5a3a112c8c IfaceIPv4:192.168.49.1 IfaceMTU:1500 IfaceMAC:da:13:76:29:c4:21} reservation:<nil>}
	I1202 20:11:14.054548  239151 network.go:211] skipping subnet 192.168.58.0/24 that is taken: &{IP:192.168.58.0 Netmask:255.255.255.0 Prefix:24 CIDR:192.168.58.0/24 Gateway:192.168.58.1 ClientMin:192.168.58.2 ClientMax:192.168.58.254 Broadcast:192.168.58.255 IsPrivate:true Interface:{IfaceName:br-95c1c690df71 IfaceIPv4:192.168.58.1 IfaceMTU:1500 IfaceMAC:7e:91:a4:f0:94:1c} reservation:<nil>}
	I1202 20:11:14.054840  239151 network.go:211] skipping subnet 192.168.67.0/24 that is taken: &{IP:192.168.67.0 Netmask:255.255.255.0 Prefix:24 CIDR:192.168.67.0/24 Gateway:192.168.67.1 ClientMin:192.168.67.2 ClientMax:192.168.67.254 Broadcast:192.168.67.255 IsPrivate:true Interface:{IfaceName:br-d322f90e5e54 IfaceIPv4:192.168.67.1 IfaceMTU:1500 IfaceMAC:7e:b6:5b:04:35:f9} reservation:<nil>}
	I1202 20:11:14.055142  239151 network.go:211] skipping subnet 192.168.76.0/24 that is taken: &{IP:192.168.76.0 Netmask:255.255.255.0 Prefix:24 CIDR:192.168.76.0/24 Gateway:192.168.76.1 ClientMin:192.168.76.2 ClientMax:192.168.76.254 Broadcast:192.168.76.255 IsPrivate:true Interface:{IfaceName:br-fa1a09614886 IfaceIPv4:192.168.76.1 IfaceMTU:1500 IfaceMAC:fa:07:8d:6a:13:df} reservation:<nil>}
	I1202 20:11:14.055570  239151 network.go:206] using free private subnet 192.168.85.0/24: &{IP:192.168.85.0 Netmask:255.255.255.0 Prefix:24 CIDR:192.168.85.0/24 Gateway:192.168.85.1 ClientMin:192.168.85.2 ClientMax:192.168.85.254 Broadcast:192.168.85.255 IsPrivate:true Interface:{IfaceName: IfaceIPv4: IfaceMTU:0 IfaceMAC:} reservation:0x40019d5b60}
	I1202 20:11:14.055584  239151 network_create.go:124] attempt to create docker network pause-362069 192.168.85.0/24 with gateway 192.168.85.1 and MTU of 1500 ...
	I1202 20:11:14.055659  239151 cli_runner.go:164] Run: docker network create --driver=bridge --subnet=192.168.85.0/24 --gateway=192.168.85.1 -o --ip-masq -o --icc -o com.docker.network.driver.mtu=1500 --label=created_by.minikube.sigs.k8s.io=true --label=name.minikube.sigs.k8s.io=pause-362069 pause-362069
	I1202 20:11:14.113523  239151 network_create.go:108] docker network pause-362069 192.168.85.0/24 created
	I1202 20:11:14.113544  239151 kic.go:121] calculated static IP "192.168.85.2" for the "pause-362069" container
	I1202 20:11:14.113627  239151 cli_runner.go:164] Run: docker ps -a --format {{.Names}}
	I1202 20:11:14.130660  239151 cli_runner.go:164] Run: docker volume create pause-362069 --label name.minikube.sigs.k8s.io=pause-362069 --label created_by.minikube.sigs.k8s.io=true
	I1202 20:11:14.149908  239151 oci.go:103] Successfully created a docker volume pause-362069
	I1202 20:11:14.149993  239151 cli_runner.go:164] Run: docker run --rm --name pause-362069-preload-sidecar --label created_by.minikube.sigs.k8s.io=true --label name.minikube.sigs.k8s.io=pause-362069 --entrypoint /usr/bin/test -v pause-362069:/var gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b -d /var/lib
	I1202 20:11:14.687682  239151 oci.go:107] Successfully prepared a docker volume pause-362069
	I1202 20:11:14.687738  239151 preload.go:188] Checking if preload exists for k8s version v1.34.2 and runtime containerd
	I1202 20:11:14.687746  239151 kic.go:194] Starting extracting preloaded images to volume ...
	I1202 20:11:14.687826  239151 cli_runner.go:164] Run: docker run --rm --entrypoint /usr/bin/tar -v /home/jenkins/minikube-integration/22021-2487/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.34.2-containerd-overlay2-arm64.tar.lz4:/preloaded.tar:ro -v pause-362069:/extractDir gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b -I lz4 -xf /preloaded.tar -C /extractDir
	I1202 20:11:21.409229  239151 cli_runner.go:217] Completed: docker run --rm --entrypoint /usr/bin/tar -v /home/jenkins/minikube-integration/22021-2487/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.34.2-containerd-overlay2-arm64.tar.lz4:/preloaded.tar:ro -v pause-362069:/extractDir gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b -I lz4 -xf /preloaded.tar -C /extractDir: (6.721367508s)
	I1202 20:11:21.409250  239151 kic.go:203] duration metric: took 6.721500818s to extract preloaded images to volume ...
	W1202 20:11:21.409410  239151 cgroups_linux.go:77] Your kernel does not support swap limit capabilities or the cgroup is not mounted.
	I1202 20:11:21.409510  239151 cli_runner.go:164] Run: docker info --format "'{{json .SecurityOptions}}'"
	I1202 20:11:21.504269  239151 cli_runner.go:164] Run: docker run -d -t --privileged --security-opt seccomp=unconfined --tmpfs /tmp --tmpfs /run -v /lib/modules:/lib/modules:ro --hostname pause-362069 --name pause-362069 --label created_by.minikube.sigs.k8s.io=true --label name.minikube.sigs.k8s.io=pause-362069 --label role.minikube.sigs.k8s.io= --label mode.minikube.sigs.k8s.io=pause-362069 --network pause-362069 --ip 192.168.85.2 --volume pause-362069:/var --security-opt apparmor=unconfined --memory=3072mb --cpus=2 -e container=docker --expose 8443 --publish=127.0.0.1::8443 --publish=127.0.0.1::22 --publish=127.0.0.1::2376 --publish=127.0.0.1::5000 --publish=127.0.0.1::32443 gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b
	I1202 20:11:21.822136  239151 cli_runner.go:164] Run: docker container inspect pause-362069 --format={{.State.Running}}
	I1202 20:11:21.845077  239151 cli_runner.go:164] Run: docker container inspect pause-362069 --format={{.State.Status}}
	I1202 20:11:21.874766  239151 cli_runner.go:164] Run: docker exec pause-362069 stat /var/lib/dpkg/alternatives/iptables
	I1202 20:11:21.928850  239151 oci.go:144] the created container "pause-362069" has a running status.
	I1202 20:11:21.928870  239151 kic.go:225] Creating ssh key for kic: /home/jenkins/minikube-integration/22021-2487/.minikube/machines/pause-362069/id_rsa...
	I1202 20:11:22.045563  239151 kic_runner.go:191] docker (temp): /home/jenkins/minikube-integration/22021-2487/.minikube/machines/pause-362069/id_rsa.pub --> /home/docker/.ssh/authorized_keys (381 bytes)
	I1202 20:11:22.072852  239151 cli_runner.go:164] Run: docker container inspect pause-362069 --format={{.State.Status}}
	I1202 20:11:22.095680  239151 kic_runner.go:93] Run: chown docker:docker /home/docker/.ssh/authorized_keys
	I1202 20:11:22.095693  239151 kic_runner.go:114] Args: [docker exec --privileged pause-362069 chown docker:docker /home/docker/.ssh/authorized_keys]
	I1202 20:11:22.149013  239151 cli_runner.go:164] Run: docker container inspect pause-362069 --format={{.State.Status}}
	I1202 20:11:22.179698  239151 machine.go:94] provisionDockerMachine start ...
	I1202 20:11:22.179783  239151 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" pause-362069
	I1202 20:11:22.214266  239151 main.go:143] libmachine: Using SSH client type: native
	I1202 20:11:22.214592  239151 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3dad70] 0x3dd270 <nil>  [] 0s} 127.0.0.1 33035 <nil> <nil>}
	I1202 20:11:22.214612  239151 main.go:143] libmachine: About to run SSH command:
	hostname
	I1202 20:11:22.215352  239151 main.go:143] libmachine: Error dialing TCP: ssh: handshake failed: EOF
	I1202 20:11:25.368235  239151 main.go:143] libmachine: SSH cmd err, output: <nil>: pause-362069
	
	I1202 20:11:25.368250  239151 ubuntu.go:182] provisioning hostname "pause-362069"
	I1202 20:11:25.368347  239151 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" pause-362069
	I1202 20:11:25.387779  239151 main.go:143] libmachine: Using SSH client type: native
	I1202 20:11:25.388159  239151 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3dad70] 0x3dd270 <nil>  [] 0s} 127.0.0.1 33035 <nil> <nil>}
	I1202 20:11:25.388174  239151 main.go:143] libmachine: About to run SSH command:
	sudo hostname pause-362069 && echo "pause-362069" | sudo tee /etc/hostname
	I1202 20:11:25.550298  239151 main.go:143] libmachine: SSH cmd err, output: <nil>: pause-362069
	
	I1202 20:11:25.550380  239151 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" pause-362069
	I1202 20:11:25.568202  239151 main.go:143] libmachine: Using SSH client type: native
	I1202 20:11:25.568553  239151 main.go:143] libmachine: &{{{<nil> 0 [] [] []} docker [0x3dad70] 0x3dd270 <nil>  [] 0s} 127.0.0.1 33035 <nil> <nil>}
	I1202 20:11:25.568566  239151 main.go:143] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\spause-362069' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 pause-362069/g' /etc/hosts;
				else 
					echo '127.0.1.1 pause-362069' | sudo tee -a /etc/hosts; 
				fi
			fi
	I1202 20:11:25.716597  239151 main.go:143] libmachine: SSH cmd err, output: <nil>: 
	I1202 20:11:25.716614  239151 ubuntu.go:188] set auth options {CertDir:/home/jenkins/minikube-integration/22021-2487/.minikube CaCertPath:/home/jenkins/minikube-integration/22021-2487/.minikube/certs/ca.pem CaPrivateKeyPath:/home/jenkins/minikube-integration/22021-2487/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/home/jenkins/minikube-integration/22021-2487/.minikube/machines/server.pem ServerKeyPath:/home/jenkins/minikube-integration/22021-2487/.minikube/machines/server-key.pem ClientKeyPath:/home/jenkins/minikube-integration/22021-2487/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/home/jenkins/minikube-integration/22021-2487/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/home/jenkins/minikube-integration/22021-2487/.minikube}
	I1202 20:11:25.716636  239151 ubuntu.go:190] setting up certificates
	I1202 20:11:25.716651  239151 provision.go:84] configureAuth start
	I1202 20:11:25.716712  239151 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" pause-362069
	I1202 20:11:25.734367  239151 provision.go:143] copyHostCerts
	I1202 20:11:25.734425  239151 exec_runner.go:144] found /home/jenkins/minikube-integration/22021-2487/.minikube/ca.pem, removing ...
	I1202 20:11:25.734432  239151 exec_runner.go:203] rm: /home/jenkins/minikube-integration/22021-2487/.minikube/ca.pem
	I1202 20:11:25.734509  239151 exec_runner.go:151] cp: /home/jenkins/minikube-integration/22021-2487/.minikube/certs/ca.pem --> /home/jenkins/minikube-integration/22021-2487/.minikube/ca.pem (1082 bytes)
	I1202 20:11:25.734618  239151 exec_runner.go:144] found /home/jenkins/minikube-integration/22021-2487/.minikube/cert.pem, removing ...
	I1202 20:11:25.734622  239151 exec_runner.go:203] rm: /home/jenkins/minikube-integration/22021-2487/.minikube/cert.pem
	I1202 20:11:25.734648  239151 exec_runner.go:151] cp: /home/jenkins/minikube-integration/22021-2487/.minikube/certs/cert.pem --> /home/jenkins/minikube-integration/22021-2487/.minikube/cert.pem (1123 bytes)
	I1202 20:11:25.734707  239151 exec_runner.go:144] found /home/jenkins/minikube-integration/22021-2487/.minikube/key.pem, removing ...
	I1202 20:11:25.734711  239151 exec_runner.go:203] rm: /home/jenkins/minikube-integration/22021-2487/.minikube/key.pem
	I1202 20:11:25.734733  239151 exec_runner.go:151] cp: /home/jenkins/minikube-integration/22021-2487/.minikube/certs/key.pem --> /home/jenkins/minikube-integration/22021-2487/.minikube/key.pem (1675 bytes)
	I1202 20:11:25.734783  239151 provision.go:117] generating server cert: /home/jenkins/minikube-integration/22021-2487/.minikube/machines/server.pem ca-key=/home/jenkins/minikube-integration/22021-2487/.minikube/certs/ca.pem private-key=/home/jenkins/minikube-integration/22021-2487/.minikube/certs/ca-key.pem org=jenkins.pause-362069 san=[127.0.0.1 192.168.85.2 localhost minikube pause-362069]
	I1202 20:11:25.938906  239151 provision.go:177] copyRemoteCerts
	I1202 20:11:25.938958  239151 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I1202 20:11:25.938998  239151 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" pause-362069
	I1202 20:11:25.957016  239151 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33035 SSHKeyPath:/home/jenkins/minikube-integration/22021-2487/.minikube/machines/pause-362069/id_rsa Username:docker}
	I1202 20:11:26.068310  239151 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/machines/server.pem --> /etc/docker/server.pem (1204 bytes)
	I1202 20:11:26.089158  239151 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1679 bytes)
	I1202 20:11:26.107231  239151 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1082 bytes)
	I1202 20:11:26.125811  239151 provision.go:87] duration metric: took 409.138159ms to configureAuth
	I1202 20:11:26.125836  239151 ubuntu.go:206] setting minikube options for container-runtime
	I1202 20:11:26.126042  239151 config.go:182] Loaded profile config "pause-362069": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.34.2
	I1202 20:11:26.126047  239151 machine.go:97] duration metric: took 3.946339387s to provisionDockerMachine
	I1202 20:11:26.126053  239151 client.go:176] duration metric: took 12.127597964s to LocalClient.Create
	I1202 20:11:26.126075  239151 start.go:167] duration metric: took 12.127655663s to libmachine.API.Create "pause-362069"
	I1202 20:11:26.126081  239151 start.go:293] postStartSetup for "pause-362069" (driver="docker")
	I1202 20:11:26.126088  239151 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I1202 20:11:26.126147  239151 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I1202 20:11:26.126185  239151 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" pause-362069
	I1202 20:11:26.143745  239151 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33035 SSHKeyPath:/home/jenkins/minikube-integration/22021-2487/.minikube/machines/pause-362069/id_rsa Username:docker}
	I1202 20:11:26.248690  239151 ssh_runner.go:195] Run: cat /etc/os-release
	I1202 20:11:26.252154  239151 main.go:143] libmachine: Couldn't set key VERSION_CODENAME, no corresponding struct field found
	I1202 20:11:26.252172  239151 info.go:137] Remote host: Debian GNU/Linux 12 (bookworm)
	I1202 20:11:26.252183  239151 filesync.go:126] Scanning /home/jenkins/minikube-integration/22021-2487/.minikube/addons for local assets ...
	I1202 20:11:26.252237  239151 filesync.go:126] Scanning /home/jenkins/minikube-integration/22021-2487/.minikube/files for local assets ...
	I1202 20:11:26.252308  239151 filesync.go:149] local asset: /home/jenkins/minikube-integration/22021-2487/.minikube/files/etc/ssl/certs/44352.pem -> 44352.pem in /etc/ssl/certs
	I1202 20:11:26.252432  239151 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs
	I1202 20:11:26.260003  239151 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/files/etc/ssl/certs/44352.pem --> /etc/ssl/certs/44352.pem (1708 bytes)
	I1202 20:11:26.278652  239151 start.go:296] duration metric: took 152.557866ms for postStartSetup
	I1202 20:11:26.279017  239151 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" pause-362069
	I1202 20:11:26.296036  239151 profile.go:143] Saving config to /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/pause-362069/config.json ...
	I1202 20:11:26.296375  239151 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I1202 20:11:26.296415  239151 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" pause-362069
	I1202 20:11:26.313427  239151 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33035 SSHKeyPath:/home/jenkins/minikube-integration/22021-2487/.minikube/machines/pause-362069/id_rsa Username:docker}
	I1202 20:11:26.417735  239151 ssh_runner.go:195] Run: sh -c "df -BG /var | awk 'NR==2{print $4}'"
	I1202 20:11:26.422371  239151 start.go:128] duration metric: took 12.429545275s to createHost
	I1202 20:11:26.422386  239151 start.go:83] releasing machines lock for "pause-362069", held for 12.42966288s
	I1202 20:11:26.422458  239151 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" pause-362069
	I1202 20:11:26.439496  239151 ssh_runner.go:195] Run: cat /version.json
	I1202 20:11:26.439541  239151 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" pause-362069
	I1202 20:11:26.439569  239151 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I1202 20:11:26.439638  239151 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" pause-362069
	I1202 20:11:26.465587  239151 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33035 SSHKeyPath:/home/jenkins/minikube-integration/22021-2487/.minikube/machines/pause-362069/id_rsa Username:docker}
	I1202 20:11:26.466200  239151 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:33035 SSHKeyPath:/home/jenkins/minikube-integration/22021-2487/.minikube/machines/pause-362069/id_rsa Username:docker}
	I1202 20:11:26.663017  239151 ssh_runner.go:195] Run: systemctl --version
	I1202 20:11:26.670387  239151 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	W1202 20:11:26.676015  239151 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I1202 20:11:26.676078  239151 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%p, " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I1202 20:11:26.713531  239151 cni.go:262] disabled [/etc/cni/net.d/87-podman-bridge.conflist, /etc/cni/net.d/10-crio-bridge.conflist.disabled] bridge cni config(s)
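The `find ... -exec mv {} {}.mk_disabled` step above sidelines any bridge/podman CNI configs so they cannot conflict with the CNI minikube installs. A minimal stdlib sketch of that rename logic (illustrative only; minikube actually runs `find`/`mv` over SSH, not Python):

```python
import fnmatch
import os

def disable_bridge_cni(net_d):
    """Rename bridge/podman CNI config files by appending .mk_disabled,
    mirroring the `find /etc/cni/net.d ... -exec mv {} {}.mk_disabled` step.
    Returns the paths that were disabled."""
    disabled = []
    for name in sorted(os.listdir(net_d)):
        path = os.path.join(net_d, name)
        if not os.path.isfile(path):
            continue
        is_bridge = fnmatch.fnmatch(name, "*bridge*") or fnmatch.fnmatch(name, "*podman*")
        if is_bridge and not name.endswith(".mk_disabled"):
            os.rename(path, path + ".mk_disabled")
            disabled.append(path)
    return disabled
```

Note that a name like `10-crio-bridge.conflist.disabled` still matches `*bridge*` and lacks the `.mk_disabled` suffix, which is why both files appear in the "disabled" list logged above.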
	I1202 20:11:26.713544  239151 start.go:496] detecting cgroup driver to use...
	I1202 20:11:26.713577  239151 detect.go:187] detected "cgroupfs" cgroup driver on host os
	I1202 20:11:26.713631  239151 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I1202 20:11:26.733605  239151 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I1202 20:11:26.752118  239151 docker.go:218] disabling cri-docker service (if available) ...
	I1202 20:11:26.752172  239151 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.socket
	I1202 20:11:26.770816  239151 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.service
	I1202 20:11:26.790013  239151 ssh_runner.go:195] Run: sudo systemctl disable cri-docker.socket
	I1202 20:11:26.901825  239151 ssh_runner.go:195] Run: sudo systemctl mask cri-docker.service
	I1202 20:11:27.038122  239151 docker.go:234] disabling docker service ...
	I1202 20:11:27.038193  239151 ssh_runner.go:195] Run: sudo systemctl stop -f docker.socket
	I1202 20:11:27.059968  239151 ssh_runner.go:195] Run: sudo systemctl stop -f docker.service
	I1202 20:11:27.073570  239151 ssh_runner.go:195] Run: sudo systemctl disable docker.socket
	I1202 20:11:27.197601  239151 ssh_runner.go:195] Run: sudo systemctl mask docker.service
	I1202 20:11:27.316178  239151 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service docker
	I1202 20:11:27.329165  239151 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I1202 20:11:27.342575  239151 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.10.1"|' /etc/containerd/config.toml"
	I1202 20:11:27.351873  239151 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I1202 20:11:27.361731  239151 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I1202 20:11:27.361792  239151 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I1202 20:11:27.371308  239151 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I1202 20:11:27.380128  239151 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I1202 20:11:27.389140  239151 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I1202 20:11:27.397960  239151 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I1202 20:11:27.406264  239151 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I1202 20:11:27.414998  239151 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I1202 20:11:27.424191  239151 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
	I1202 20:11:27.433669  239151 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I1202 20:11:27.441853  239151 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I1202 20:11:27.449429  239151 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1202 20:11:27.573037  239151 ssh_runner.go:195] Run: sudo systemctl restart containerd
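The run of `sed -i` commands above rewrites `/etc/containerd/config.toml` in place: pin the sandbox (pause) image, force `SystemdCgroup = false` to match the detected "cgroupfs" host driver, and migrate legacy runtime names to `io.containerd.runc.v2`. A sketch of the same rewrites as pure-Python regex substitutions (illustrative re-implementation, not minikube's actual code):

```python
import re

def patch_containerd_config(toml_text, pause_image="registry.k8s.io/pause:3.10.1"):
    """Apply the config.toml rewrites performed by the sed commands above."""
    # Pin the sandbox image used for pod sandboxes.
    text = re.sub(r'(?m)^( *)sandbox_image = .*$',
                  r'\1sandbox_image = "%s"' % pause_image, toml_text)
    text = re.sub(r'(?m)^( *)restrict_oom_score_adj = .*$',
                  r'\1restrict_oom_score_adj = false', text)
    # cgroupfs driver on the host => containerd must not use the systemd cgroup driver.
    text = re.sub(r'(?m)^( *)SystemdCgroup = .*$',
                  r'\1SystemdCgroup = false', text)
    # Migrate deprecated runtime names to the v2 runc shim.
    text = text.replace('"io.containerd.runtime.v1.linux"', '"io.containerd.runc.v2"')
    text = text.replace('"io.containerd.runc.v1"', '"io.containerd.runc.v2"')
    return text
```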
	I1202 20:11:27.696667  239151 start.go:543] Will wait 60s for socket path /run/containerd/containerd.sock
	I1202 20:11:27.696744  239151 ssh_runner.go:195] Run: stat /run/containerd/containerd.sock
	I1202 20:11:27.700655  239151 start.go:564] Will wait 60s for crictl version
	I1202 20:11:27.700710  239151 ssh_runner.go:195] Run: which crictl
	I1202 20:11:27.704197  239151 ssh_runner.go:195] Run: sudo /usr/local/bin/crictl version
	I1202 20:11:27.730023  239151 start.go:580] Version:  0.1.0
	RuntimeName:  containerd
	RuntimeVersion:  v2.1.5
	RuntimeApiVersion:  v1
	I1202 20:11:27.730080  239151 ssh_runner.go:195] Run: containerd --version
	I1202 20:11:27.755330  239151 ssh_runner.go:195] Run: containerd --version
	I1202 20:11:27.779867  239151 out.go:179] * Preparing Kubernetes v1.34.2 on containerd 2.1.5 ...
	I1202 20:11:27.782786  239151 cli_runner.go:164] Run: docker network inspect pause-362069 --format "{"Name": "{{.Name}}","Driver": "{{.Driver}}","Subnet": "{{range .IPAM.Config}}{{.Subnet}}{{end}}","Gateway": "{{range .IPAM.Config}}{{.Gateway}}{{end}}","MTU": {{if (index .Options "com.docker.network.driver.mtu")}}{{(index .Options "com.docker.network.driver.mtu")}}{{else}}0{{end}}, "ContainerIPs": [{{range $k,$v := .Containers }}"{{$v.IPv4Address}}",{{end}}]}"
	I1202 20:11:27.802357  239151 ssh_runner.go:195] Run: grep 192.168.85.1	host.minikube.internal$ /etc/hosts
	I1202 20:11:27.806317  239151 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\thost.minikube.internal$' "/etc/hosts"; echo "192.168.85.1	host.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
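The `grep`-then-rewrite pair above is minikube's idempotent `/etc/hosts` update: drop any existing `host.minikube.internal` line, then append the fresh mapping, so repeated starts never accumulate duplicates. The same logic as a small string transform (a sketch, not the shell pipeline itself):

```python
def upsert_host_entry(hosts_text, ip, hostname):
    """Idempotently map `hostname` to `ip`, like the
    `{ grep -v $'\\thost...'; echo ...; } > /tmp/h.$$; sudo cp` trick:
    remove any existing line ending in the hostname, then append it."""
    suffix = "\t" + hostname
    kept = [line for line in hosts_text.splitlines() if not line.endswith(suffix)]
    kept.append(ip + "\t" + hostname)
    return "\n".join(kept) + "\n"
```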
	I1202 20:11:27.816045  239151 kubeadm.go:884] updating cluster {Name:pause-362069 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b Memory:3072 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.34.2 ClusterName:pause-362069 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerName
s:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.85.2 Port:8443 KubernetesVersion:v1.34.2 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePat
h: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} ...
	I1202 20:11:27.816148  239151 preload.go:188] Checking if preload exists for k8s version v1.34.2 and runtime containerd
	I1202 20:11:27.816206  239151 ssh_runner.go:195] Run: sudo crictl images --output json
	I1202 20:11:27.843754  239151 containerd.go:627] all images are preloaded for containerd runtime.
	I1202 20:11:27.843766  239151 containerd.go:534] Images already preloaded, skipping extraction
	I1202 20:11:27.843830  239151 ssh_runner.go:195] Run: sudo crictl images --output json
	I1202 20:11:27.871754  239151 containerd.go:627] all images are preloaded for containerd runtime.
	I1202 20:11:27.871766  239151 cache_images.go:86] Images are preloaded, skipping loading
	I1202 20:11:27.871777  239151 kubeadm.go:935] updating node { 192.168.85.2 8443 v1.34.2 containerd true true} ...
	I1202 20:11:27.871877  239151 kubeadm.go:947] kubelet [Unit]
	Wants=containerd.service
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.34.2/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=pause-362069 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.168.85.2
	
	[Install]
	 config:
	{KubernetesVersion:v1.34.2 ClusterName:pause-362069 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I1202 20:11:27.871949  239151 ssh_runner.go:195] Run: sudo crictl info
	I1202 20:11:27.897246  239151 cni.go:84] Creating CNI manager for ""
	I1202 20:11:27.897256  239151 cni.go:143] "docker" driver + "containerd" runtime found, recommending kindnet
	I1202 20:11:27.897270  239151 kubeadm.go:85] Using pod CIDR: 10.244.0.0/16
	I1202 20:11:27.897295  239151 kubeadm.go:190] kubeadm options: {CertDir:/var/lib/minikube/certs ServiceCIDR:10.96.0.0/12 PodSubnet:10.244.0.0/16 AdvertiseAddress:192.168.85.2 APIServerPort:8443 KubernetesVersion:v1.34.2 EtcdDataDir:/var/lib/minikube/etcd EtcdExtraArgs:map[] ClusterName:pause-362069 NodeName:pause-362069 DNSDomain:cluster.local CRISocket:/run/containerd/containerd.sock ImageRepository: ComponentOptions:[{Component:apiServer ExtraArgs:map[enable-admission-plugins:NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota] Pairs:map[certSANs:["127.0.0.1", "localhost", "192.168.85.2"]]} {Component:controllerManager ExtraArgs:map[allocate-node-cidrs:true leader-elect:false] Pairs:map[]} {Component:scheduler ExtraArgs:map[leader-elect:false] Pairs:map[]}] FeatureArgs:map[] NodeIP:192.168.85.2 CgroupDriver:cgroupfs ClientCAFile:/var/lib/minikube/certs/ca.crt StaticPodPath:/etc/k
ubernetes/manifests ControlPlaneAddress:control-plane.minikube.internal KubeProxyOptions:map[] ResolvConfSearchRegression:false KubeletConfigOpts:map[containerRuntimeEndpoint:unix:///run/containerd/containerd.sock hairpinMode:hairpin-veth runtimeRequestTimeout:15m] PrependCriSocketUnix:true}
	I1202 20:11:27.897433  239151 kubeadm.go:196] kubeadm config:
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: InitConfiguration
	localAPIEndpoint:
	  advertiseAddress: 192.168.85.2
	  bindPort: 8443
	bootstrapTokens:
	  - groups:
	      - system:bootstrappers:kubeadm:default-node-token
	    ttl: 24h0m0s
	    usages:
	      - signing
	      - authentication
	nodeRegistration:
	  criSocket: unix:///run/containerd/containerd.sock
	  name: "pause-362069"
	  kubeletExtraArgs:
	    - name: "node-ip"
	      value: "192.168.85.2"
	  taints: []
	---
	apiVersion: kubeadm.k8s.io/v1beta4
	kind: ClusterConfiguration
	apiServer:
	  certSANs: ["127.0.0.1", "localhost", "192.168.85.2"]
	  extraArgs:
	    - name: "enable-admission-plugins"
	      value: "NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota"
	controllerManager:
	  extraArgs:
	    - name: "allocate-node-cidrs"
	      value: "true"
	    - name: "leader-elect"
	      value: "false"
	scheduler:
	  extraArgs:
	    - name: "leader-elect"
	      value: "false"
	certificatesDir: /var/lib/minikube/certs
	clusterName: mk
	controlPlaneEndpoint: control-plane.minikube.internal:8443
	etcd:
	  local:
	    dataDir: /var/lib/minikube/etcd
	kubernetesVersion: v1.34.2
	networking:
	  dnsDomain: cluster.local
	  podSubnet: "10.244.0.0/16"
	  serviceSubnet: 10.96.0.0/12
	---
	apiVersion: kubelet.config.k8s.io/v1beta1
	kind: KubeletConfiguration
	authentication:
	  x509:
	    clientCAFile: /var/lib/minikube/certs/ca.crt
	cgroupDriver: cgroupfs
	containerRuntimeEndpoint: unix:///run/containerd/containerd.sock
	hairpinMode: hairpin-veth
	runtimeRequestTimeout: 15m
	clusterDomain: "cluster.local"
	# disable disk resource management by default
	imageGCHighThresholdPercent: 100
	evictionHard:
	  nodefs.available: "0%"
	  nodefs.inodesFree: "0%"
	  imagefs.available: "0%"
	failSwapOn: false
	staticPodPath: /etc/kubernetes/manifests
	---
	apiVersion: kubeproxy.config.k8s.io/v1alpha1
	kind: KubeProxyConfiguration
	clusterCIDR: "10.244.0.0/16"
	metricsBindAddress: 0.0.0.0:10249
	conntrack:
	  maxPerCore: 0
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_established"
	  tcpEstablishedTimeout: 0s
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_close"
	  tcpCloseWaitTimeout: 0s
	
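The rendered kubeadm config above is a single file holding four YAML documents separated by `---`: InitConfiguration, ClusterConfiguration, KubeletConfiguration, and KubeProxyConfiguration (the 2225-byte `kubeadm.yaml.new` copied below). A stdlib-only sketch that splits such a rendering into documents and extracts each `kind:` (a real consumer would use a YAML parser):

```python
def yaml_doc_kinds(rendered):
    """Split a multi-document YAML rendering on `---` separators and return
    the `kind:` value of each document, in order."""
    kinds = []
    for doc in rendered.split("\n---\n"):
        for line in doc.splitlines():
            if line.startswith("kind:"):
                kinds.append(line.split(":", 1)[1].strip())
    return kinds
```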
	I1202 20:11:27.897510  239151 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.34.2
	I1202 20:11:27.905768  239151 binaries.go:51] Found k8s binaries, skipping transfer
	I1202 20:11:27.905843  239151 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /var/tmp/minikube
	I1202 20:11:27.913824  239151 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (316 bytes)
	I1202 20:11:27.927134  239151 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (352 bytes)
	I1202 20:11:27.940214  239151 ssh_runner.go:362] scp memory --> /var/tmp/minikube/kubeadm.yaml.new (2225 bytes)
	I1202 20:11:27.953554  239151 ssh_runner.go:195] Run: grep 192.168.85.2	control-plane.minikube.internal$ /etc/hosts
	I1202 20:11:27.957188  239151 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\tcontrol-plane.minikube.internal$' "/etc/hosts"; echo "192.168.85.2	control-plane.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I1202 20:11:27.966950  239151 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1202 20:11:28.087375  239151 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I1202 20:11:28.104965  239151 certs.go:69] Setting up /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/pause-362069 for IP: 192.168.85.2
	I1202 20:11:28.104987  239151 certs.go:195] generating shared ca certs ...
	I1202 20:11:28.105012  239151 certs.go:227] acquiring lock for ca certs: {Name:mk2ce7651a779b9fbf8eac798f9ac184328de0c6 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1202 20:11:28.105173  239151 certs.go:236] skipping valid "minikubeCA" ca cert: /home/jenkins/minikube-integration/22021-2487/.minikube/ca.key
	I1202 20:11:28.105232  239151 certs.go:236] skipping valid "proxyClientCA" ca cert: /home/jenkins/minikube-integration/22021-2487/.minikube/proxy-client-ca.key
	I1202 20:11:28.105239  239151 certs.go:257] generating profile certs ...
	I1202 20:11:28.105299  239151 certs.go:364] generating signed profile cert for "minikube-user": /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/pause-362069/client.key
	I1202 20:11:28.105310  239151 crypto.go:68] Generating cert /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/pause-362069/client.crt with IP's: []
	I1202 20:11:28.163215  239151 crypto.go:156] Writing cert to /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/pause-362069/client.crt ...
	I1202 20:11:28.163236  239151 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/pause-362069/client.crt: {Name:mk4d73cd9e89619ef301c7ba32727732c221d1c3 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1202 20:11:28.163438  239151 crypto.go:164] Writing key to /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/pause-362069/client.key ...
	I1202 20:11:28.163444  239151 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/pause-362069/client.key: {Name:mkb92c46994ed4337aee1f52b61bba2a8cdbd17b Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1202 20:11:28.163535  239151 certs.go:364] generating signed profile cert for "minikube": /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/pause-362069/apiserver.key.d040056b
	I1202 20:11:28.163547  239151 crypto.go:68] Generating cert /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/pause-362069/apiserver.crt.d040056b with IP's: [10.96.0.1 127.0.0.1 10.0.0.1 192.168.85.2]
	I1202 20:11:29.294758  239151 crypto.go:156] Writing cert to /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/pause-362069/apiserver.crt.d040056b ...
	I1202 20:11:29.294781  239151 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/pause-362069/apiserver.crt.d040056b: {Name:mk3c179a89d71c4a6cf554c5181a404cd1a88b53 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1202 20:11:29.294984  239151 crypto.go:164] Writing key to /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/pause-362069/apiserver.key.d040056b ...
	I1202 20:11:29.294993  239151 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/pause-362069/apiserver.key.d040056b: {Name:mka8e3d6027bf952ae77e949469e9543a587f53b Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1202 20:11:29.295084  239151 certs.go:382] copying /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/pause-362069/apiserver.crt.d040056b -> /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/pause-362069/apiserver.crt
	I1202 20:11:29.295164  239151 certs.go:386] copying /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/pause-362069/apiserver.key.d040056b -> /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/pause-362069/apiserver.key
	I1202 20:11:29.295215  239151 certs.go:364] generating signed profile cert for "aggregator": /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/pause-362069/proxy-client.key
	I1202 20:11:29.295227  239151 crypto.go:68] Generating cert /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/pause-362069/proxy-client.crt with IP's: []
	I1202 20:11:29.673371  239151 crypto.go:156] Writing cert to /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/pause-362069/proxy-client.crt ...
	I1202 20:11:29.673386  239151 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/pause-362069/proxy-client.crt: {Name:mk9074678db0626d2f0f4656c232ea743fc5e017 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1202 20:11:29.673548  239151 crypto.go:164] Writing key to /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/pause-362069/proxy-client.key ...
	I1202 20:11:29.673554  239151 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/pause-362069/proxy-client.key: {Name:mkc4691300c5171c490d06bdbcb93eac8b3bc4ad Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1202 20:11:29.673721  239151 certs.go:484] found cert: /home/jenkins/minikube-integration/22021-2487/.minikube/certs/4435.pem (1338 bytes)
	W1202 20:11:29.673758  239151 certs.go:480] ignoring /home/jenkins/minikube-integration/22021-2487/.minikube/certs/4435_empty.pem, impossibly tiny 0 bytes
	I1202 20:11:29.673765  239151 certs.go:484] found cert: /home/jenkins/minikube-integration/22021-2487/.minikube/certs/ca-key.pem (1679 bytes)
	I1202 20:11:29.673793  239151 certs.go:484] found cert: /home/jenkins/minikube-integration/22021-2487/.minikube/certs/ca.pem (1082 bytes)
	I1202 20:11:29.673843  239151 certs.go:484] found cert: /home/jenkins/minikube-integration/22021-2487/.minikube/certs/cert.pem (1123 bytes)
	I1202 20:11:29.673869  239151 certs.go:484] found cert: /home/jenkins/minikube-integration/22021-2487/.minikube/certs/key.pem (1675 bytes)
	I1202 20:11:29.673916  239151 certs.go:484] found cert: /home/jenkins/minikube-integration/22021-2487/.minikube/files/etc/ssl/certs/44352.pem (1708 bytes)
	I1202 20:11:29.674611  239151 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I1202 20:11:29.702614  239151 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1675 bytes)
	I1202 20:11:29.739848  239151 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I1202 20:11:29.764430  239151 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1675 bytes)
	I1202 20:11:29.782705  239151 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/pause-362069/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1419 bytes)
	I1202 20:11:29.801764  239151 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/pause-362069/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1675 bytes)
	I1202 20:11:29.819853  239151 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/pause-362069/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I1202 20:11:29.837884  239151 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/pause-362069/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1675 bytes)
	I1202 20:11:29.855475  239151 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I1202 20:11:29.873983  239151 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/certs/4435.pem --> /usr/share/ca-certificates/4435.pem (1338 bytes)
	I1202 20:11:29.892880  239151 ssh_runner.go:362] scp /home/jenkins/minikube-integration/22021-2487/.minikube/files/etc/ssl/certs/44352.pem --> /usr/share/ca-certificates/44352.pem (1708 bytes)
	I1202 20:11:29.911302  239151 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (738 bytes)
	I1202 20:11:29.924434  239151 ssh_runner.go:195] Run: openssl version
	I1202 20:11:29.932802  239151 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/4435.pem && ln -fs /usr/share/ca-certificates/4435.pem /etc/ssl/certs/4435.pem"
	I1202 20:11:29.942266  239151 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/4435.pem
	I1202 20:11:29.946033  239151 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Dec  2 18:58 /usr/share/ca-certificates/4435.pem
	I1202 20:11:29.946093  239151 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/4435.pem
	I1202 20:11:29.987522  239151 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/51391683.0 || ln -fs /etc/ssl/certs/4435.pem /etc/ssl/certs/51391683.0"
	I1202 20:11:29.995777  239151 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/44352.pem && ln -fs /usr/share/ca-certificates/44352.pem /etc/ssl/certs/44352.pem"
	I1202 20:11:30.007010  239151 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/44352.pem
	I1202 20:11:30.011708  239151 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Dec  2 18:58 /usr/share/ca-certificates/44352.pem
	I1202 20:11:30.011774  239151 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/44352.pem
	I1202 20:11:30.087639  239151 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/3ec20f2e.0 || ln -fs /etc/ssl/certs/44352.pem /etc/ssl/certs/3ec20f2e.0"
	I1202 20:11:30.097890  239151 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I1202 20:11:30.107418  239151 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I1202 20:11:30.111807  239151 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Dec  2 18:48 /usr/share/ca-certificates/minikubeCA.pem
	I1202 20:11:30.111867  239151 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I1202 20:11:30.154640  239151 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
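Each cert install above is a pair of `ln -fs` calls: link the cert into the OpenSSL certs directory under its own name, then under `<subject_hash>.0` (e.g. `b5213941.0` for minikubeCA) so OpenSSL's hash-based directory lookup can find it. A sketch of that install step, where `subject_hash` stands in for the output of `openssl x509 -hash -noout -in <cert>`:

```python
import os

def install_ca_cert(cert_path, ssl_certs_dir, subject_hash):
    """Symlink a CA cert under its filename and under '<subject_hash>.0',
    mirroring the two `ln -fs` commands above. `subject_hash` is assumed to
    come from `openssl x509 -hash -noout`."""
    name_link = os.path.join(ssl_certs_dir, os.path.basename(cert_path))
    hash_link = os.path.join(ssl_certs_dir, subject_hash + ".0")
    for link in (name_link, hash_link):
        if os.path.islink(link) or os.path.exists(link):
            os.remove(link)          # -f: replace any stale link
        os.symlink(cert_path, link)  # -s: symbolic link
    return hash_link
```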
	I1202 20:11:30.163546  239151 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I1202 20:11:30.167527  239151 certs.go:400] 'apiserver-kubelet-client' cert doesn't exist, likely first start: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/certs/apiserver-kubelet-client.crt': No such file or directory
	I1202 20:11:30.167574  239151 kubeadm.go:401] StartCluster: {Name:pause-362069 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b Memory:3072 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.34.2 ClusterName:pause-362069 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[
] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.85.2 Port:8443 KubernetesVersion:v1.34.2 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath:
SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1202 20:11:30.167649  239151 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:paused Name: Namespaces:[kube-system]}
	I1202 20:11:30.167709  239151 ssh_runner.go:195] Run: sudo -s eval "crictl ps -a --quiet --label io.kubernetes.pod.namespace=kube-system"
	I1202 20:11:30.196798  239151 cri.go:89] found id: ""
	I1202 20:11:30.196885  239151 ssh_runner.go:195] Run: sudo ls /var/lib/kubelet/kubeadm-flags.env /var/lib/kubelet/config.yaml /var/lib/minikube/etcd
	I1202 20:11:30.205282  239151 ssh_runner.go:195] Run: sudo cp /var/tmp/minikube/kubeadm.yaml.new /var/tmp/minikube/kubeadm.yaml
	I1202 20:11:30.213546  239151 kubeadm.go:215] ignoring SystemVerification for kubeadm because of docker driver
	I1202 20:11:30.213609  239151 ssh_runner.go:195] Run: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf
	I1202 20:11:30.221823  239151 kubeadm.go:156] config check failed, skipping stale config cleanup: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	ls: cannot access '/etc/kubernetes/admin.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/kubelet.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/controller-manager.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/scheduler.conf': No such file or directory
	I1202 20:11:30.221833  239151 kubeadm.go:158] found existing configuration files:
	
	I1202 20:11:30.221890  239151 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/admin.conf
	I1202 20:11:30.230296  239151 kubeadm.go:164] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/admin.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/admin.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/admin.conf: No such file or directory
	I1202 20:11:30.230354  239151 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/admin.conf
	I1202 20:11:30.237851  239151 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/kubelet.conf
	I1202 20:11:30.245813  239151 kubeadm.go:164] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/kubelet.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/kubelet.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/kubelet.conf: No such file or directory
	I1202 20:11:30.245871  239151 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/kubelet.conf
	I1202 20:11:30.253809  239151 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/controller-manager.conf
	I1202 20:11:30.262712  239151 kubeadm.go:164] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/controller-manager.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/controller-manager.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/controller-manager.conf: No such file or directory
	I1202 20:11:30.262774  239151 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/controller-manager.conf
	I1202 20:11:30.270581  239151 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/scheduler.conf
	I1202 20:11:30.278653  239151 kubeadm.go:164] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/scheduler.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/scheduler.conf: No such file or directory
	I1202 20:11:30.278713  239151 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/scheduler.conf
	I1202 20:11:30.286565  239151 ssh_runner.go:286] Start: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.34.2:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables"
	I1202 20:11:30.352879  239151 kubeadm.go:319] 	[WARNING SystemVerification]: cgroups v1 support is in maintenance mode, please migrate to cgroups v2
	I1202 20:11:30.353101  239151 kubeadm.go:319] 	[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
	I1202 20:11:30.432810  239151 kubeadm.go:319] 	[WARNING Service-Kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	I1202 20:11:46.809057  239151 kubeadm.go:319] [init] Using Kubernetes version: v1.34.2
	I1202 20:11:46.809106  239151 kubeadm.go:319] [preflight] Running pre-flight checks
	I1202 20:11:46.809194  239151 kubeadm.go:319] [preflight] The system verification failed. Printing the output from the verification:
	I1202 20:11:46.809248  239151 kubeadm.go:319] KERNEL_VERSION: 5.15.0-1084-aws
	I1202 20:11:46.809283  239151 kubeadm.go:319] OS: Linux
	I1202 20:11:46.809327  239151 kubeadm.go:319] CGROUPS_CPU: enabled
	I1202 20:11:46.809374  239151 kubeadm.go:319] CGROUPS_CPUACCT: enabled
	I1202 20:11:46.809419  239151 kubeadm.go:319] CGROUPS_CPUSET: enabled
	I1202 20:11:46.809465  239151 kubeadm.go:319] CGROUPS_DEVICES: enabled
	I1202 20:11:46.809512  239151 kubeadm.go:319] CGROUPS_FREEZER: enabled
	I1202 20:11:46.809561  239151 kubeadm.go:319] CGROUPS_MEMORY: enabled
	I1202 20:11:46.809605  239151 kubeadm.go:319] CGROUPS_PIDS: enabled
	I1202 20:11:46.809651  239151 kubeadm.go:319] CGROUPS_HUGETLB: enabled
	I1202 20:11:46.809696  239151 kubeadm.go:319] CGROUPS_BLKIO: enabled
	I1202 20:11:46.809766  239151 kubeadm.go:319] [preflight] Pulling images required for setting up a Kubernetes cluster
	I1202 20:11:46.809859  239151 kubeadm.go:319] [preflight] This might take a minute or two, depending on the speed of your internet connection
	I1202 20:11:46.809947  239151 kubeadm.go:319] [preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	I1202 20:11:46.810008  239151 kubeadm.go:319] [certs] Using certificateDir folder "/var/lib/minikube/certs"
	I1202 20:11:46.813093  239151 out.go:252]   - Generating certificates and keys ...
	I1202 20:11:46.813187  239151 kubeadm.go:319] [certs] Using existing ca certificate authority
	I1202 20:11:46.813251  239151 kubeadm.go:319] [certs] Using existing apiserver certificate and key on disk
	I1202 20:11:46.813317  239151 kubeadm.go:319] [certs] Generating "apiserver-kubelet-client" certificate and key
	I1202 20:11:46.813382  239151 kubeadm.go:319] [certs] Generating "front-proxy-ca" certificate and key
	I1202 20:11:46.813447  239151 kubeadm.go:319] [certs] Generating "front-proxy-client" certificate and key
	I1202 20:11:46.813502  239151 kubeadm.go:319] [certs] Generating "etcd/ca" certificate and key
	I1202 20:11:46.813556  239151 kubeadm.go:319] [certs] Generating "etcd/server" certificate and key
	I1202 20:11:46.813707  239151 kubeadm.go:319] [certs] etcd/server serving cert is signed for DNS names [localhost pause-362069] and IPs [192.168.85.2 127.0.0.1 ::1]
	I1202 20:11:46.813778  239151 kubeadm.go:319] [certs] Generating "etcd/peer" certificate and key
	I1202 20:11:46.813901  239151 kubeadm.go:319] [certs] etcd/peer serving cert is signed for DNS names [localhost pause-362069] and IPs [192.168.85.2 127.0.0.1 ::1]
	I1202 20:11:46.813976  239151 kubeadm.go:319] [certs] Generating "etcd/healthcheck-client" certificate and key
	I1202 20:11:46.814038  239151 kubeadm.go:319] [certs] Generating "apiserver-etcd-client" certificate and key
	I1202 20:11:46.814102  239151 kubeadm.go:319] [certs] Generating "sa" key and public key
	I1202 20:11:46.814168  239151 kubeadm.go:319] [kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	I1202 20:11:46.814222  239151 kubeadm.go:319] [kubeconfig] Writing "admin.conf" kubeconfig file
	I1202 20:11:46.814290  239151 kubeadm.go:319] [kubeconfig] Writing "super-admin.conf" kubeconfig file
	I1202 20:11:46.814344  239151 kubeadm.go:319] [kubeconfig] Writing "kubelet.conf" kubeconfig file
	I1202 20:11:46.814406  239151 kubeadm.go:319] [kubeconfig] Writing "controller-manager.conf" kubeconfig file
	I1202 20:11:46.814458  239151 kubeadm.go:319] [kubeconfig] Writing "scheduler.conf" kubeconfig file
	I1202 20:11:46.814548  239151 kubeadm.go:319] [etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	I1202 20:11:46.814618  239151 kubeadm.go:319] [control-plane] Using manifest folder "/etc/kubernetes/manifests"
	I1202 20:11:46.817473  239151 out.go:252]   - Booting up control plane ...
	I1202 20:11:46.817613  239151 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-apiserver"
	I1202 20:11:46.817691  239151 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-controller-manager"
	I1202 20:11:46.817757  239151 kubeadm.go:319] [control-plane] Creating static Pod manifest for "kube-scheduler"
	I1202 20:11:46.817877  239151 kubeadm.go:319] [kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	I1202 20:11:46.817987  239151 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	I1202 20:11:46.818126  239151 kubeadm.go:319] [patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	I1202 20:11:46.818220  239151 kubeadm.go:319] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	I1202 20:11:46.818262  239151 kubeadm.go:319] [kubelet-start] Starting the kubelet
	I1202 20:11:46.818399  239151 kubeadm.go:319] [wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	I1202 20:11:46.818515  239151 kubeadm.go:319] [kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	I1202 20:11:46.818613  239151 kubeadm.go:319] [kubelet-check] The kubelet is healthy after 1.501821042s
	I1202 20:11:46.818706  239151 kubeadm.go:319] [control-plane-check] Waiting for healthy control plane components. This can take up to 4m0s
	I1202 20:11:46.818788  239151 kubeadm.go:319] [control-plane-check] Checking kube-apiserver at https://192.168.85.2:8443/livez
	I1202 20:11:46.818908  239151 kubeadm.go:319] [control-plane-check] Checking kube-controller-manager at https://127.0.0.1:10257/healthz
	I1202 20:11:46.818998  239151 kubeadm.go:319] [control-plane-check] Checking kube-scheduler at https://127.0.0.1:10259/livez
	I1202 20:11:46.819072  239151 kubeadm.go:319] [control-plane-check] kube-controller-manager is healthy after 5.158455913s
	I1202 20:11:46.819138  239151 kubeadm.go:319] [control-plane-check] kube-scheduler is healthy after 5.491380524s
	I1202 20:11:46.819204  239151 kubeadm.go:319] [control-plane-check] kube-apiserver is healthy after 7.001648606s
	I1202 20:11:46.819309  239151 kubeadm.go:319] [upload-config] Storing the configuration used in ConfigMap "kubeadm-config" in the "kube-system" Namespace
	I1202 20:11:46.819433  239151 kubeadm.go:319] [kubelet] Creating a ConfigMap "kubelet-config" in namespace kube-system with the configuration for the kubelets in the cluster
	I1202 20:11:46.819490  239151 kubeadm.go:319] [upload-certs] Skipping phase. Please see --upload-certs
	I1202 20:11:46.819683  239151 kubeadm.go:319] [mark-control-plane] Marking the node pause-362069 as control-plane by adding the labels: [node-role.kubernetes.io/control-plane node.kubernetes.io/exclude-from-external-load-balancers]
	I1202 20:11:46.819737  239151 kubeadm.go:319] [bootstrap-token] Using token: loclpu.wno9qy4hudc9uvtn
	I1202 20:11:46.822771  239151 out.go:252]   - Configuring RBAC rules ...
	I1202 20:11:46.822924  239151 kubeadm.go:319] [bootstrap-token] Configuring bootstrap tokens, cluster-info ConfigMap, RBAC Roles
	I1202 20:11:46.823023  239151 kubeadm.go:319] [bootstrap-token] Configured RBAC rules to allow Node Bootstrap tokens to get nodes
	I1202 20:11:46.823188  239151 kubeadm.go:319] [bootstrap-token] Configured RBAC rules to allow Node Bootstrap tokens to post CSRs in order for nodes to get long term certificate credentials
	I1202 20:11:46.823317  239151 kubeadm.go:319] [bootstrap-token] Configured RBAC rules to allow the csrapprover controller automatically approve CSRs from a Node Bootstrap Token
	I1202 20:11:46.823434  239151 kubeadm.go:319] [bootstrap-token] Configured RBAC rules to allow certificate rotation for all node client certificates in the cluster
	I1202 20:11:46.823518  239151 kubeadm.go:319] [bootstrap-token] Creating the "cluster-info" ConfigMap in the "kube-public" namespace
	I1202 20:11:46.823630  239151 kubeadm.go:319] [kubelet-finalize] Updating "/etc/kubernetes/kubelet.conf" to point to a rotatable kubelet client certificate and key
	I1202 20:11:46.823672  239151 kubeadm.go:319] [addons] Applied essential addon: CoreDNS
	I1202 20:11:46.823716  239151 kubeadm.go:319] [addons] Applied essential addon: kube-proxy
	I1202 20:11:46.823718  239151 kubeadm.go:319] 
	I1202 20:11:46.823777  239151 kubeadm.go:319] Your Kubernetes control-plane has initialized successfully!
	I1202 20:11:46.823780  239151 kubeadm.go:319] 
	I1202 20:11:46.823855  239151 kubeadm.go:319] To start using your cluster, you need to run the following as a regular user:
	I1202 20:11:46.823858  239151 kubeadm.go:319] 
	I1202 20:11:46.823882  239151 kubeadm.go:319]   mkdir -p $HOME/.kube
	I1202 20:11:46.823940  239151 kubeadm.go:319]   sudo cp -i /etc/kubernetes/admin.conf $HOME/.kube/config
	I1202 20:11:46.823989  239151 kubeadm.go:319]   sudo chown $(id -u):$(id -g) $HOME/.kube/config
	I1202 20:11:46.823991  239151 kubeadm.go:319] 
	I1202 20:11:46.824044  239151 kubeadm.go:319] Alternatively, if you are the root user, you can run:
	I1202 20:11:46.824046  239151 kubeadm.go:319] 
	I1202 20:11:46.824092  239151 kubeadm.go:319]   export KUBECONFIG=/etc/kubernetes/admin.conf
	I1202 20:11:46.824095  239151 kubeadm.go:319] 
	I1202 20:11:46.824146  239151 kubeadm.go:319] You should now deploy a pod network to the cluster.
	I1202 20:11:46.824219  239151 kubeadm.go:319] Run "kubectl apply -f [podnetwork].yaml" with one of the options listed at:
	I1202 20:11:46.824286  239151 kubeadm.go:319]   https://kubernetes.io/docs/concepts/cluster-administration/addons/
	I1202 20:11:46.824289  239151 kubeadm.go:319] 
	I1202 20:11:46.824496  239151 kubeadm.go:319] You can now join any number of control-plane nodes by copying certificate authorities
	I1202 20:11:46.824572  239151 kubeadm.go:319] and service account keys on each node and then running the following as root:
	I1202 20:11:46.824574  239151 kubeadm.go:319] 
	I1202 20:11:46.824657  239151 kubeadm.go:319]   kubeadm join control-plane.minikube.internal:8443 --token loclpu.wno9qy4hudc9uvtn \
	I1202 20:11:46.824759  239151 kubeadm.go:319] 	--discovery-token-ca-cert-hash sha256:960e83e852433fb1d558aa2742ca72f6a7aa0fbd856ca4d8169e82a13f086e68 \
	I1202 20:11:46.824777  239151 kubeadm.go:319] 	--control-plane 
	I1202 20:11:46.824780  239151 kubeadm.go:319] 
	I1202 20:11:46.824864  239151 kubeadm.go:319] Then you can join any number of worker nodes by running the following on each as root:
	I1202 20:11:46.824866  239151 kubeadm.go:319] 
	I1202 20:11:46.824948  239151 kubeadm.go:319] kubeadm join control-plane.minikube.internal:8443 --token loclpu.wno9qy4hudc9uvtn \
	I1202 20:11:46.825062  239151 kubeadm.go:319] 	--discovery-token-ca-cert-hash sha256:960e83e852433fb1d558aa2742ca72f6a7aa0fbd856ca4d8169e82a13f086e68 
	I1202 20:11:46.825070  239151 cni.go:84] Creating CNI manager for ""
	I1202 20:11:46.825077  239151 cni.go:143] "docker" driver + "containerd" runtime found, recommending kindnet
	I1202 20:11:46.828205  239151 out.go:179] * Configuring CNI (Container Networking Interface) ...
	I1202 20:11:46.831216  239151 ssh_runner.go:195] Run: stat /opt/cni/bin/portmap
	I1202 20:11:46.835587  239151 cni.go:182] applying CNI manifest using /var/lib/minikube/binaries/v1.34.2/kubectl ...
	I1202 20:11:46.835597  239151 ssh_runner.go:362] scp memory --> /var/tmp/minikube/cni.yaml (2601 bytes)
	I1202 20:11:46.848928  239151 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.34.2/kubectl apply --kubeconfig=/var/lib/minikube/kubeconfig -f /var/tmp/minikube/cni.yaml
	I1202 20:11:47.170652  239151 ssh_runner.go:195] Run: /bin/bash -c "cat /proc/$(pgrep kube-apiserver)/oom_adj"
	I1202 20:11:47.170726  239151 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.34.2/kubectl create clusterrolebinding minikube-rbac --clusterrole=cluster-admin --serviceaccount=kube-system:default --kubeconfig=/var/lib/minikube/kubeconfig
	I1202 20:11:47.170802  239151 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.34.2/kubectl --kubeconfig=/var/lib/minikube/kubeconfig label --overwrite nodes pause-362069 minikube.k8s.io/updated_at=2025_12_02T20_11_47_0700 minikube.k8s.io/version=v1.37.0 minikube.k8s.io/commit=f814d1da9a9aaec9cd0504e94606ef30589e1689 minikube.k8s.io/name=pause-362069 minikube.k8s.io/primary=true
	I1202 20:11:47.453368  239151 ops.go:34] apiserver oom_adj: -16
	I1202 20:11:47.453464  239151 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.34.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I1202 20:11:47.954577  239151 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.34.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I1202 20:11:48.454118  239151 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.34.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I1202 20:11:48.953838  239151 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.34.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I1202 20:11:49.454227  239151 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.34.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I1202 20:11:49.954186  239151 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.34.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I1202 20:11:50.453570  239151 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.34.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I1202 20:11:50.953956  239151 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.34.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I1202 20:11:51.453592  239151 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.34.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I1202 20:11:51.621691  239151 kubeadm.go:1114] duration metric: took 4.451026382s to wait for elevateKubeSystemPrivileges
	I1202 20:11:51.621710  239151 kubeadm.go:403] duration metric: took 21.454141391s to StartCluster
	I1202 20:11:51.621724  239151 settings.go:142] acquiring lock: {Name:mka76ea0dcf16fdbb68808885f8360c0083029b4 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1202 20:11:51.621801  239151 settings.go:150] Updating kubeconfig:  /home/jenkins/minikube-integration/22021-2487/kubeconfig
	I1202 20:11:51.622742  239151 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/22021-2487/kubeconfig: {Name:mka13b3f16f6b2d645ade32cb83bfcf203300413 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1202 20:11:51.622932  239151 start.go:236] Will wait 6m0s for node &{Name: IP:192.168.85.2 Port:8443 KubernetesVersion:v1.34.2 ContainerRuntime:containerd ControlPlane:true Worker:true}
	I1202 20:11:51.623010  239151 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.34.2/kubectl --kubeconfig=/var/lib/minikube/kubeconfig -n kube-system get configmap coredns -o yaml"
	I1202 20:11:51.623253  239151 config.go:182] Loaded profile config "pause-362069": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.34.2
	I1202 20:11:51.626069  239151 out.go:179] * Verifying Kubernetes components...
	I1202 20:11:51.628905  239151 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I1202 20:11:51.861383  239151 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.34.2/kubectl --kubeconfig=/var/lib/minikube/kubeconfig -n kube-system get configmap coredns -o yaml | sed -e '/^        forward . \/etc\/resolv.conf.*/i \        hosts {\n           192.168.85.1 host.minikube.internal\n           fallthrough\n        }' -e '/^        errors *$/i \        log' | sudo /var/lib/minikube/binaries/v1.34.2/kubectl --kubeconfig=/var/lib/minikube/kubeconfig replace -f -"
	I1202 20:11:51.883086  239151 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I1202 20:11:52.331331  239151 start.go:977] {"host.minikube.internal": 192.168.85.1} host record injected into CoreDNS's ConfigMap
	I1202 20:11:52.333704  239151 node_ready.go:35] waiting up to 6m0s for node "pause-362069" to be "Ready" ...
	I1202 20:11:52.836800  239151 kapi.go:214] "coredns" deployment in "kube-system" namespace and "pause-362069" context rescaled to 1 replicas
	W1202 20:11:54.336777  239151 node_ready.go:57] node "pause-362069" has "Ready":"False" status (will retry)
	W1202 20:11:56.836915  239151 node_ready.go:57] node "pause-362069" has "Ready":"False" status (will retry)
	W1202 20:11:59.336389  239151 node_ready.go:57] node "pause-362069" has "Ready":"False" status (will retry)
	W1202 20:12:01.336602  239151 node_ready.go:57] node "pause-362069" has "Ready":"False" status (will retry)
	W1202 20:12:03.336837  239151 node_ready.go:57] node "pause-362069" has "Ready":"False" status (will retry)
	W1202 20:12:05.836913  239151 node_ready.go:57] node "pause-362069" has "Ready":"False" status (will retry)
	W1202 20:12:07.836990  239151 node_ready.go:57] node "pause-362069" has "Ready":"False" status (will retry)
	I1202 20:12:12.226574  202120 kubeadm.go:319] [kubelet-check] The kubelet is not healthy after 4m0.000313622s
	I1202 20:12:12.226609  202120 kubeadm.go:319] 
	I1202 20:12:12.226667  202120 kubeadm.go:319] Unfortunately, an error has occurred, likely caused by:
	I1202 20:12:12.226700  202120 kubeadm.go:319] 	- The kubelet is not running
	I1202 20:12:12.226805  202120 kubeadm.go:319] 	- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	I1202 20:12:12.226811  202120 kubeadm.go:319] 
	I1202 20:12:12.226915  202120 kubeadm.go:319] If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
	I1202 20:12:12.226947  202120 kubeadm.go:319] 	- 'systemctl status kubelet'
	I1202 20:12:12.226977  202120 kubeadm.go:319] 	- 'journalctl -xeu kubelet'
	I1202 20:12:12.226981  202120 kubeadm.go:319] 
	I1202 20:12:12.231714  202120 kubeadm.go:319] 	[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
	I1202 20:12:12.232192  202120 kubeadm.go:319] 	[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
	I1202 20:12:12.232313  202120 kubeadm.go:319] 	[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	I1202 20:12:12.232580  202120 kubeadm.go:319] error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	I1202 20:12:12.232591  202120 kubeadm.go:319] 
	I1202 20:12:12.232661  202120 kubeadm.go:319] To see the stack trace of this error execute with --v=5 or higher
	I1202 20:12:12.232725  202120 kubeadm.go:403] duration metric: took 12m7.280851867s to StartCluster
	I1202 20:12:12.232775  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I1202 20:12:12.232852  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I1202 20:12:12.259777  202120 cri.go:89] found id: ""
	I1202 20:12:12.259803  202120 logs.go:282] 0 containers: []
	W1202 20:12:12.259812  202120 logs.go:284] No container was found matching "kube-apiserver"
	I1202 20:12:12.259820  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I1202 20:12:12.259885  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I1202 20:12:12.286765  202120 cri.go:89] found id: ""
	I1202 20:12:12.286791  202120 logs.go:282] 0 containers: []
	W1202 20:12:12.286799  202120 logs.go:284] No container was found matching "etcd"
	I1202 20:12:12.286806  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I1202 20:12:12.286865  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I1202 20:12:12.310876  202120 cri.go:89] found id: ""
	I1202 20:12:12.310911  202120 logs.go:282] 0 containers: []
	W1202 20:12:12.310919  202120 logs.go:284] No container was found matching "coredns"
	I1202 20:12:12.310926  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I1202 20:12:12.310986  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I1202 20:12:12.342125  202120 cri.go:89] found id: ""
	I1202 20:12:12.342148  202120 logs.go:282] 0 containers: []
	W1202 20:12:12.342157  202120 logs.go:284] No container was found matching "kube-scheduler"
	I1202 20:12:12.342163  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I1202 20:12:12.342222  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I1202 20:12:12.369246  202120 cri.go:89] found id: ""
	I1202 20:12:12.369270  202120 logs.go:282] 0 containers: []
	W1202 20:12:12.369279  202120 logs.go:284] No container was found matching "kube-proxy"
	I1202 20:12:12.369286  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I1202 20:12:12.369346  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I1202 20:12:12.400088  202120 cri.go:89] found id: ""
	I1202 20:12:12.400111  202120 logs.go:282] 0 containers: []
	W1202 20:12:12.400120  202120 logs.go:284] No container was found matching "kube-controller-manager"
	I1202 20:12:12.400126  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I1202 20:12:12.400184  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I1202 20:12:12.431257  202120 cri.go:89] found id: ""
	I1202 20:12:12.431280  202120 logs.go:282] 0 containers: []
	W1202 20:12:12.431289  202120 logs.go:284] No container was found matching "kindnet"
	I1202 20:12:12.431295  202120 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:storage-provisioner Namespaces:[]}
	I1202 20:12:12.431354  202120 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=storage-provisioner
	I1202 20:12:12.461161  202120 cri.go:89] found id: ""
	I1202 20:12:12.461183  202120 logs.go:282] 0 containers: []
	W1202 20:12:12.461191  202120 logs.go:284] No container was found matching "storage-provisioner"
	I1202 20:12:12.461201  202120 logs.go:123] Gathering logs for dmesg ...
	I1202 20:12:12.461213  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I1202 20:12:12.476068  202120 logs.go:123] Gathering logs for describe nodes ...
	I1202 20:12:12.476091  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	W1202 20:12:12.547327  202120 logs.go:130] failed describe nodes: command: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	 output: 
	** stderr ** 
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	** /stderr **
	I1202 20:12:12.547349  202120 logs.go:123] Gathering logs for containerd ...
	I1202 20:12:12.547361  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I1202 20:12:12.586389  202120 logs.go:123] Gathering logs for container status ...
	I1202 20:12:12.586422  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I1202 20:12:12.618762  202120 logs.go:123] Gathering logs for kubelet ...
	I1202 20:12:12.618789  202120 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	W1202 20:12:12.679897  202120 out.go:434] Error starting cluster: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-beta.0
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.000313622s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	
	To see the stack trace of this error execute with --v=5 or higher
	W1202 20:12:12.679958  202120 out.go:285] * 
	W1202 20:12:12.680015  202120 out.go:285] X Error starting cluster: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-beta.0
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.000313622s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	
	To see the stack trace of this error execute with --v=5 or higher
	
	W1202 20:12:12.680033  202120 out.go:285] * 
	W1202 20:12:12.682162  202120 out.go:308] ╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	I1202 20:12:12.687985  202120 out.go:203] 
	W1202 20:12:12.691983  202120 out.go:285] X Exiting due to K8S_KUBELET_NOT_RUNNING: wait: sudo /bin/bash -c "env PATH="/var/lib/minikube/binaries/v1.35.0-beta.0:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem,SystemVerification,FileContent--proc-sys-net-bridge-bridge-nf-call-iptables": Process exited with status 1
	stdout:
	[init] Using Kubernetes version: v1.35.0-beta.0
	[preflight] Running pre-flight checks
	[preflight] The system verification failed. Printing the output from the verification:
	KERNEL_VERSION: 5.15.0-1084-aws
	OS: Linux
	CGROUPS_CPU: enabled
	CGROUPS_CPUACCT: enabled
	CGROUPS_CPUSET: enabled
	CGROUPS_DEVICES: enabled
	CGROUPS_FREEZER: enabled
	CGROUPS_MEMORY: enabled
	CGROUPS_PIDS: enabled
	CGROUPS_HUGETLB: enabled
	CGROUPS_BLKIO: enabled
	[preflight] Pulling images required for setting up a Kubernetes cluster
	[preflight] This might take a minute or two, depending on the speed of your internet connection
	[preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	[certs] Using certificateDir folder "/var/lib/minikube/certs"
	[certs] Using existing ca certificate authority
	[certs] Using existing apiserver certificate and key on disk
	[certs] Using existing apiserver-kubelet-client certificate and key on disk
	[certs] Using existing front-proxy-ca certificate authority
	[certs] Using existing front-proxy-client certificate and key on disk
	[certs] Using existing etcd/ca certificate authority
	[certs] Using existing etcd/server certificate and key on disk
	[certs] Using existing etcd/peer certificate and key on disk
	[certs] Using existing etcd/healthcheck-client certificate and key on disk
	[certs] Using existing apiserver-etcd-client certificate and key on disk
	[certs] Using the existing "sa" key
	[kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	[kubeconfig] Writing "admin.conf" kubeconfig file
	[kubeconfig] Writing "super-admin.conf" kubeconfig file
	[kubeconfig] Writing "kubelet.conf" kubeconfig file
	[kubeconfig] Writing "controller-manager.conf" kubeconfig file
	[kubeconfig] Writing "scheduler.conf" kubeconfig file
	[etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	[control-plane] Using manifest folder "/etc/kubernetes/manifests"
	[control-plane] Creating static Pod manifest for "kube-apiserver"
	[control-plane] Creating static Pod manifest for "kube-controller-manager"
	[control-plane] Creating static Pod manifest for "kube-scheduler"
	[kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/instance-config.yaml"
	[patches] Applied patch of type "application/strategic-merge-patch+json" to target "kubeletconfiguration"
	[kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	[kubelet-start] Starting the kubelet
	[wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	[kubelet-check] Waiting for a healthy kubelet at http://127.0.0.1:10248/healthz. This can take up to 4m0s
	[kubelet-check] The kubelet is not healthy after 4m0.000313622s
	
	Unfortunately, an error has occurred, likely caused by:
		- The kubelet is not running
		- The kubelet is unhealthy due to a misconfiguration of the node in some way (required cgroups disabled)
	
	If you are on a systemd-powered system, you can try to troubleshoot the error with the following commands:
		- 'systemctl status kubelet'
		- 'journalctl -xeu kubelet'
	
	
	stderr:
		[WARNING SystemVerification]: failed to parse kernel config: unable to load kernel module: "configs", output: "modprobe: FATAL: Module configs not found in directory /lib/modules/5.15.0-1084-aws\n", err: exit status 1
		[WARNING SystemVerification]: cgroups v1 support is deprecated and will be removed in a future release. Please migrate to cgroups v2. To explicitly enable cgroups v1 support for kubelet v1.35 or newer, you must set the kubelet configuration option 'FailCgroupV1' to 'false'. You must also explicitly skip this validation. For more information, see https://git.k8s.io/enhancements/keps/sig-node/5573-remove-cgroup-v1
		[WARNING Service-kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	error: error execution phase wait-control-plane: failed while waiting for the kubelet to start: The HTTP call equal to 'curl -sSL http://127.0.0.1:10248/healthz' returned error: Get "http://127.0.0.1:10248/healthz": context deadline exceeded
	
	To see the stack trace of this error execute with --v=5 or higher
	
	W1202 20:12:12.692054  202120 out.go:285] * Suggestion: Check output of 'journalctl -xeu kubelet', try passing --extra-config=kubelet.cgroup-driver=systemd to minikube start
	W1202 20:12:12.692078  202120 out.go:285] * Related issue: https://github.com/kubernetes/minikube/issues/4172
	I1202 20:12:12.695545  202120 out.go:203] 
	W1202 20:12:10.336515  239151 node_ready.go:57] node "pause-362069" has "Ready":"False" status (will retry)
	W1202 20:12:12.337645  239151 node_ready.go:57] node "pause-362069" has "Ready":"False" status (will retry)
	
	
	==> container status <==
	CONTAINER           IMAGE               CREATED             STATE               NAME                ATTEMPT             POD ID              POD                 NAMESPACE
	
	
	==> containerd <==
	Dec 02 20:04:08 kubernetes-upgrade-685093 containerd[557]: time="2025-12-02T20:04:08.766153567Z" level=info msg="StopPodSandbox for \"ad7aa36890528e29340c00464db4f28ddddeebee33fcf5a11fb6e9f1d753c421\" returns successfully"
	Dec 02 20:04:08 kubernetes-upgrade-685093 containerd[557]: time="2025-12-02T20:04:08.766516959Z" level=info msg="RemovePodSandbox for \"ad7aa36890528e29340c00464db4f28ddddeebee33fcf5a11fb6e9f1d753c421\""
	Dec 02 20:04:08 kubernetes-upgrade-685093 containerd[557]: time="2025-12-02T20:04:08.766555253Z" level=info msg="Forcibly stopping sandbox \"ad7aa36890528e29340c00464db4f28ddddeebee33fcf5a11fb6e9f1d753c421\""
	Dec 02 20:04:08 kubernetes-upgrade-685093 containerd[557]: time="2025-12-02T20:04:08.766592053Z" level=info msg="Container to stop \"80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3\" must be in running or unknown state, current state \"CONTAINER_EXITED\""
	Dec 02 20:04:08 kubernetes-upgrade-685093 containerd[557]: time="2025-12-02T20:04:08.766977968Z" level=info msg="TearDown network for sandbox \"ad7aa36890528e29340c00464db4f28ddddeebee33fcf5a11fb6e9f1d753c421\" successfully"
	Dec 02 20:04:08 kubernetes-upgrade-685093 containerd[557]: time="2025-12-02T20:04:08.775085489Z" level=info msg="Ensure that sandbox ad7aa36890528e29340c00464db4f28ddddeebee33fcf5a11fb6e9f1d753c421 in task-service has been cleanup successfully"
	Dec 02 20:04:08 kubernetes-upgrade-685093 containerd[557]: time="2025-12-02T20:04:08.783218881Z" level=info msg="RemovePodSandbox \"ad7aa36890528e29340c00464db4f28ddddeebee33fcf5a11fb6e9f1d753c421\" returns successfully"
	Dec 02 20:04:08 kubernetes-upgrade-685093 containerd[557]: time="2025-12-02T20:04:08.783918473Z" level=info msg="StopPodSandbox for \"b9beb6264777334e01df3c2c227929cfa813dbeade0187f08445a0dfa3adb10b\""
	Dec 02 20:04:08 kubernetes-upgrade-685093 containerd[557]: time="2025-12-02T20:04:08.783990416Z" level=info msg="Container to stop \"f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53\" must be in running or unknown state, current state \"CONTAINER_EXITED\""
	Dec 02 20:04:08 kubernetes-upgrade-685093 containerd[557]: time="2025-12-02T20:04:08.784392618Z" level=info msg="TearDown network for sandbox \"b9beb6264777334e01df3c2c227929cfa813dbeade0187f08445a0dfa3adb10b\" successfully"
	Dec 02 20:04:08 kubernetes-upgrade-685093 containerd[557]: time="2025-12-02T20:04:08.784439552Z" level=info msg="StopPodSandbox for \"b9beb6264777334e01df3c2c227929cfa813dbeade0187f08445a0dfa3adb10b\" returns successfully"
	Dec 02 20:04:08 kubernetes-upgrade-685093 containerd[557]: time="2025-12-02T20:04:08.784849049Z" level=info msg="RemovePodSandbox for \"b9beb6264777334e01df3c2c227929cfa813dbeade0187f08445a0dfa3adb10b\""
	Dec 02 20:04:08 kubernetes-upgrade-685093 containerd[557]: time="2025-12-02T20:04:08.784876848Z" level=info msg="Forcibly stopping sandbox \"b9beb6264777334e01df3c2c227929cfa813dbeade0187f08445a0dfa3adb10b\""
	Dec 02 20:04:08 kubernetes-upgrade-685093 containerd[557]: time="2025-12-02T20:04:08.784906296Z" level=info msg="Container to stop \"f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53\" must be in running or unknown state, current state \"CONTAINER_EXITED\""
	Dec 02 20:04:08 kubernetes-upgrade-685093 containerd[557]: time="2025-12-02T20:04:08.785248970Z" level=info msg="TearDown network for sandbox \"b9beb6264777334e01df3c2c227929cfa813dbeade0187f08445a0dfa3adb10b\" successfully"
	Dec 02 20:04:08 kubernetes-upgrade-685093 containerd[557]: time="2025-12-02T20:04:08.793768966Z" level=info msg="Ensure that sandbox b9beb6264777334e01df3c2c227929cfa813dbeade0187f08445a0dfa3adb10b in task-service has been cleanup successfully"
	Dec 02 20:04:08 kubernetes-upgrade-685093 containerd[557]: time="2025-12-02T20:04:08.799909005Z" level=info msg="RemovePodSandbox \"b9beb6264777334e01df3c2c227929cfa813dbeade0187f08445a0dfa3adb10b\" returns successfully"
	Dec 02 20:09:08 kubernetes-upgrade-685093 containerd[557]: time="2025-12-02T20:09:08.740077745Z" level=info msg="container event discarded" container=18bdbe615499271a9c902aeebdace359887771cab76a730b8941b46789b08d92 type=CONTAINER_DELETED_EVENT
	Dec 02 20:09:08 kubernetes-upgrade-685093 containerd[557]: time="2025-12-02T20:09:08.755433599Z" level=info msg="container event discarded" container=f793b8f7bdc8f8b0a08e19d8fb9ce491d4c3935a3db1bb352b5cbea1d6156380 type=CONTAINER_DELETED_EVENT
	Dec 02 20:09:08 kubernetes-upgrade-685093 containerd[557]: time="2025-12-02T20:09:08.767914551Z" level=info msg="container event discarded" container=86ccd9b7bfab4ca04e870c1fe6c35b2663fa612c4d28da9464c93233d2f77992 type=CONTAINER_DELETED_EVENT
	Dec 02 20:09:08 kubernetes-upgrade-685093 containerd[557]: time="2025-12-02T20:09:08.767979660Z" level=info msg="container event discarded" container=aa8dd729ea64c5db1502ceceb66dc9670d7d25e600f91ec438f2734ce66ec981 type=CONTAINER_DELETED_EVENT
	Dec 02 20:09:08 kubernetes-upgrade-685093 containerd[557]: time="2025-12-02T20:09:08.786433598Z" level=info msg="container event discarded" container=80a9896cb78d6cbb4d433582337f4827bf11afd4f8324f56abe48ff07b4057c3 type=CONTAINER_DELETED_EVENT
	Dec 02 20:09:08 kubernetes-upgrade-685093 containerd[557]: time="2025-12-02T20:09:08.786502350Z" level=info msg="container event discarded" container=ad7aa36890528e29340c00464db4f28ddddeebee33fcf5a11fb6e9f1d753c421 type=CONTAINER_DELETED_EVENT
	Dec 02 20:09:08 kubernetes-upgrade-685093 containerd[557]: time="2025-12-02T20:09:08.804724277Z" level=info msg="container event discarded" container=f20b5b85386ea1e77c7018b322ab3fe6af6444e301619bae776fca3ef056af53 type=CONTAINER_DELETED_EVENT
	Dec 02 20:09:08 kubernetes-upgrade-685093 containerd[557]: time="2025-12-02T20:09:08.804792118Z" level=info msg="container event discarded" container=b9beb6264777334e01df3c2c227929cfa813dbeade0187f08445a0dfa3adb10b type=CONTAINER_DELETED_EVENT
	
	
	==> describe nodes <==
	command /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" failed with error: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.35.0-beta.0/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	
	==> dmesg <==
	[Dec 2 18:17] ACPI: SRAT not present
	[  +0.000000] ACPI: SRAT not present
	[  +0.000000] SPI driver altr_a10sr has no spi_device_id for altr,a10sr
	[  +0.015127] device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
	[  +0.494583] systemd[1]: Configuration file /run/systemd/system/netplan-ovs-cleanup.service is marked world-inaccessible. This has no effect as configuration data is accessible via APIs without restrictions. Proceeding anyway.
	[  +0.035754] systemd[1]: /lib/systemd/system/snapd.service:23: Unknown key name 'RestartMode' in section 'Service', ignoring.
	[  +0.870945] ena 0000:00:05.0: LLQ is not supported Fallback to host mode policy.
	[  +6.299680] kauditd_printk_skb: 36 callbacks suppressed
	
	
	==> kernel <==
	 20:12:14 up  1:54,  0 user,  load average: 1.17, 1.24, 1.48
	Linux kubernetes-upgrade-685093 5.15.0-1084-aws #91~20.04.1-Ubuntu SMP Fri May 2 07:00:04 UTC 2025 aarch64 GNU/Linux
	PRETTY_NAME="Debian GNU/Linux 12 (bookworm)"
	
	
	==> kubelet <==
	Dec 02 20:12:10 kubernetes-upgrade-685093 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 02 20:12:11 kubernetes-upgrade-685093 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 319.
	Dec 02 20:12:11 kubernetes-upgrade-685093 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 02 20:12:11 kubernetes-upgrade-685093 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 02 20:12:11 kubernetes-upgrade-685093 kubelet[14474]: E1202 20:12:11.717693   14474 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 02 20:12:11 kubernetes-upgrade-685093 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 02 20:12:11 kubernetes-upgrade-685093 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 02 20:12:12 kubernetes-upgrade-685093 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 320.
	Dec 02 20:12:12 kubernetes-upgrade-685093 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 02 20:12:12 kubernetes-upgrade-685093 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 02 20:12:12 kubernetes-upgrade-685093 kubelet[14531]: E1202 20:12:12.487650   14531 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 02 20:12:12 kubernetes-upgrade-685093 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 02 20:12:12 kubernetes-upgrade-685093 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 02 20:12:13 kubernetes-upgrade-685093 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 321.
	Dec 02 20:12:13 kubernetes-upgrade-685093 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 02 20:12:13 kubernetes-upgrade-685093 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 02 20:12:13 kubernetes-upgrade-685093 kubelet[14569]: E1202 20:12:13.281221   14569 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 02 20:12:13 kubernetes-upgrade-685093 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 02 20:12:13 kubernetes-upgrade-685093 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	Dec 02 20:12:13 kubernetes-upgrade-685093 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 322.
	Dec 02 20:12:13 kubernetes-upgrade-685093 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 02 20:12:13 kubernetes-upgrade-685093 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
	Dec 02 20:12:13 kubernetes-upgrade-685093 kubelet[14589]: E1202 20:12:13.991808   14589 run.go:72] "command failed" err="failed to validate kubelet configuration, error: kubelet is configured to not run on a host using cgroup v1. cgroup v1 support is unsupported and will be removed in a future release, path: &TypeMeta{Kind:,APIVersion:,}"
	Dec 02 20:12:13 kubernetes-upgrade-685093 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
	Dec 02 20:12:13 kubernetes-upgrade-685093 systemd[1]: kubelet.service: Failed with result 'exit-code'.
	

                                                
                                                
-- /stdout --
helpers_test.go:262: (dbg) Run:  out/minikube-linux-arm64 status --format={{.APIServer}} -p kubernetes-upgrade-685093 -n kubernetes-upgrade-685093
helpers_test.go:262: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.APIServer}} -p kubernetes-upgrade-685093 -n kubernetes-upgrade-685093: exit status 2 (355.907306ms)

                                                
                                                
-- stdout --
	Stopped

                                                
                                                
-- /stdout --
helpers_test.go:262: status error: exit status 2 (may be ok)
helpers_test.go:264: "kubernetes-upgrade-685093" apiserver is not running, skipping kubectl commands (state="Stopped")
helpers_test.go:175: Cleaning up "kubernetes-upgrade-685093" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-linux-arm64 delete -p kubernetes-upgrade-685093
helpers_test.go:178: (dbg) Done: out/minikube-linux-arm64 delete -p kubernetes-upgrade-685093: (2.323538183s)
--- FAIL: TestKubernetesUpgrade (792.37s)

                                                
                                    
TestStartStop/group/no-preload/serial/AddonExistsAfterStop (7200.143s)

                                                
                                                
=== RUN   TestStartStop/group/no-preload/serial/AddonExistsAfterStop
start_stop_delete_test.go:285: (dbg) TestStartStop/group/no-preload/serial/AddonExistsAfterStop: waiting 9m0s for pods matching "k8s-app=kubernetes-dashboard" in namespace "kubernetes-dashboard" ...
helpers_test.go:337: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.76.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.76.2:8443: connect: connection refused
helpers_test.go:337: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.76.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.76.2:8443: connect: connection refused
helpers_test.go:337: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.76.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.76.2:8443: connect: connection refused
E1202 20:45:29.434720    4435 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/kindnet-987034/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
helpers_test.go:337: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.76.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.76.2:8443: connect: connection refused
helpers_test.go:337: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.76.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.76.2:8443: connect: connection refused
helpers_test.go:337: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.76.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.76.2:8443: connect: connection refused
helpers_test.go:337: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.76.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.76.2:8443: connect: connection refused
helpers_test.go:337: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.76.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.76.2:8443: connect: connection refused
helpers_test.go:337: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.76.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.76.2:8443: connect: connection refused
helpers_test.go:337: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.76.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.76.2:8443: connect: connection refused
helpers_test.go:337: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.76.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.76.2:8443: connect: connection refused
helpers_test.go:337: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.76.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.76.2:8443: connect: connection refused
helpers_test.go:337: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.76.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.76.2:8443: connect: connection refused
helpers_test.go:337: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.76.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.76.2:8443: connect: connection refused
helpers_test.go:337: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.76.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.76.2:8443: connect: connection refused
helpers_test.go:337: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.76.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.76.2:8443: connect: connection refused
E1202 20:45:41.816312    4435 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/auto-987034/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1202 20:45:46.065977    4435 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/addons-932514/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
I1202 20:46:01.909750    4435 config.go:182] Loaded profile config "flannel-987034": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.34.2
E1202 20:46:03.868522    4435 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1202 20:46:10.396605    4435 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/kindnet-987034/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1202 20:46:28.511481    4435 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/calico-987034/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1202 20:46:28.517884    4435 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/calico-987034/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1202 20:46:28.529337    4435 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/calico-987034/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1202 20:46:28.551360    4435 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/calico-987034/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1202 20:46:28.592953    4435 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/calico-987034/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1202 20:46:28.674527    4435 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/calico-987034/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1202 20:46:28.836876    4435 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/calico-987034/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1202 20:46:29.158516    4435 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/calico-987034/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1202 20:46:29.799973    4435 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/calico-987034/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1202 20:46:31.081587    4435 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/calico-987034/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1202 20:46:38.764974    4435 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/calico-987034/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1202 20:46:49.007821    4435 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/calico-987034/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1202 20:47:09.490013    4435 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/calico-987034/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
helpers_test.go:337: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.76.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.76.2:8443: connect: connection refused
	[previous warning repeated 22 more times]
E1202 20:47:32.318358    4435 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/kindnet-987034/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
helpers_test.go:337: TestStartStop/group/no-preload/serial/AddonExistsAfterStop: WARNING: pod list for "kubernetes-dashboard" "k8s-app=kubernetes-dashboard" returned: Get "https://192.168.76.2:8443/api/v1/namespaces/kubernetes-dashboard/pods?labelSelector=k8s-app%3Dkubernetes-dashboard": dial tcp 192.168.76.2:8443: connect: connection refused
	[previous warning repeated 15 more times]
panic: test timed out after 2h0m0s
	running tests:
		TestNetworkPlugins (34m55s)
		TestNetworkPlugins/group/bridge (1m15s)
		TestStartStop (36m34s)
		TestStartStop/group/no-preload (28m13s)
		TestStartStop/group/no-preload/serial (28m13s)
		TestStartStop/group/no-preload/serial/AddonExistsAfterStop (2m21s)

goroutine 6467 [running]:
testing.(*M).startAlarm.func1()
	/usr/local/go/src/testing/testing.go:2682 +0x2b0
created by time.goFunc
	/usr/local/go/src/time/sleep.go:215 +0x38

goroutine 1 [chan receive, 31 minutes]:
testing.tRunner.func1()
	/usr/local/go/src/testing/testing.go:1891 +0x3d0
testing.tRunner(0x40004e8700, 0x40009ddbb8)
	/usr/local/go/src/testing/testing.go:1940 +0x104
testing.runTests(0x40006c0000, {0x534c580, 0x2c, 0x2c}, {0x40009ddd08?, 0x125774?, 0x5374f80?})
	/usr/local/go/src/testing/testing.go:2475 +0x3b8
testing.(*M).Run(0x40008f92c0)
	/usr/local/go/src/testing/testing.go:2337 +0x530
k8s.io/minikube/test/integration.TestMain(0x40008f92c0)
	/home/jenkins/workspace/Build_Cross/test/integration/main_test.go:64 +0xf0
main.main()
	_testmain.go:133 +0x88

goroutine 6321 [sync.Cond.Wait, 2 minutes]:
sync.runtime_notifyListWait(0x4000869010, 0x0)
	/usr/local/go/src/runtime/sema.go:606 +0x140
sync.(*Cond).Wait(0x4000869000)
	/usr/local/go/src/sync/cond.go:71 +0xa4
k8s.io/client-go/util/workqueue.(*Typed[...]).Get(0x3701d80)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/queue.go:277 +0x80
k8s.io/client-go/transport.(*dynamicClientCert).processNextWorkItem(0x4001c1dc20)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:160 +0x38
k8s.io/client-go/transport.(*dynamicClientCert).runWorker(...)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:155
k8s.io/apimachinery/pkg/util/wait.BackoffUntil.func1({0x4001a645b0?, 0x21dd4?})
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:233 +0x24
k8s.io/apimachinery/pkg/util/wait.BackoffUntilWithContext.func1({0x36e5b10?, 0x4000106380?}, 0x0?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:255 +0x58
k8s.io/apimachinery/pkg/util/wait.BackoffUntilWithContext({0x36e5b10, 0x4000106380}, 0x40013d9f38, {0x369d680, 0x4001f11b60}, 0x1)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:256 +0xac
k8s.io/apimachinery/pkg/util/wait.BackoffUntil(0x11?, {0x369d680?, 0x4001f11b60?}, 0x1?, 0x36e5778?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:233 +0x4c
k8s.io/apimachinery/pkg/util/wait.JitterUntil(0x4001f1e2a0, 0x3b9aca00, 0x0, 0x1, 0x4000106380)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:210 +0x7c
k8s.io/apimachinery/pkg/util/wait.Until(...)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:163
created by k8s.io/client-go/transport.(*dynamicClientCert).run in goroutine 6318
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:144 +0x174

goroutine 6455 [IO wait]:
internal/poll.runtime_pollWait(0xffff5f6c2000, 0x72)
	/usr/local/go/src/runtime/netpoll.go:351 +0xa0
internal/poll.(*pollDesc).wait(0x4001879080?, 0x400156e400?, 0x1)
	/usr/local/go/src/internal/poll/fd_poll_runtime.go:84 +0x28
internal/poll.(*pollDesc).waitRead(...)
	/usr/local/go/src/internal/poll/fd_poll_runtime.go:89
internal/poll.(*FD).Read(0x4001879080, {0x400156e400, 0x200, 0x200})
	/usr/local/go/src/internal/poll/fd_unix.go:165 +0x1e0
os.(*File).read(...)
	/usr/local/go/src/os/file_posix.go:29
os.(*File).Read(0x4000113a88, {0x400156e400?, 0x400146f548?, 0xcc7cc?})
	/usr/local/go/src/os/file.go:144 +0x68
bytes.(*Buffer).ReadFrom(0x40019fc630, {0x369ba58, 0x40003aee10})
	/usr/local/go/src/bytes/buffer.go:217 +0x90
io.copyBuffer({0x369bc40, 0x40019fc630}, {0x369ba58, 0x40003aee10}, {0x0, 0x0, 0x0})
	/usr/local/go/src/io/io.go:415 +0x14c
io.Copy(...)
	/usr/local/go/src/io/io.go:388
os.genericWriteTo(0x4000113a88?, {0x369bc40, 0x40019fc630})
	/usr/local/go/src/os/file.go:295 +0x58
os.(*File).WriteTo(0x4000113a88, {0x369bc40, 0x40019fc630})
	/usr/local/go/src/os/file.go:273 +0x9c
io.copyBuffer({0x369bc40, 0x40019fc630}, {0x369bad8, 0x4000113a88}, {0x0, 0x0, 0x0})
	/usr/local/go/src/io/io.go:411 +0x98
io.Copy(...)
	/usr/local/go/src/io/io.go:388
os/exec.(*Cmd).writerDescriptor.func1()
	/usr/local/go/src/os/exec/exec.go:596 +0x40
os/exec.(*Cmd).Start.func2(0x36f75f0?)
	/usr/local/go/src/os/exec/exec.go:749 +0x30
created by os/exec.(*Cmd).Start in goroutine 3545
	/usr/local/go/src/os/exec/exec.go:748 +0x6a4

goroutine 3440 [chan receive, 11 minutes]:
testing.tRunner.func1()
	/usr/local/go/src/testing/testing.go:1891 +0x3d0
testing.tRunner(0x40014f2000, 0x339b730)
	/usr/local/go/src/testing/testing.go:1940 +0x104
created by testing.(*T).Run in goroutine 3262
	/usr/local/go/src/testing/testing.go:1997 +0x364

goroutine 4835 [select, 3 minutes]:
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1.1()
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:297 +0x13c
created by k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1 in goroutine 4834
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:280 +0xb8

goroutine 167 [chan receive, 117 minutes]:
k8s.io/client-go/transport.(*dynamicClientCert).run(0x40006c3aa0, 0x4000106380)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:151 +0x218
created by k8s.io/client-go/transport.(*tlsTransportCache).get in goroutine 149
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cache.go:126 +0x4d0

goroutine 5135 [chan receive, 7 minutes]:
k8s.io/client-go/transport.(*dynamicClientCert).run(0x40013e48a0, 0x4000106380)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:151 +0x218
created by k8s.io/client-go/transport.(*tlsTransportCache).get in goroutine 5133
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cache.go:126 +0x4d0

goroutine 4256 [sync.Cond.Wait, 2 minutes]:
sync.runtime_notifyListWait(0x40017e2690, 0x2)
	/usr/local/go/src/runtime/sema.go:606 +0x140
sync.(*Cond).Wait(0x40017e2680)
	/usr/local/go/src/sync/cond.go:71 +0xa4
k8s.io/client-go/util/workqueue.(*Typed[...]).Get(0x3701d80)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/queue.go:277 +0x80
k8s.io/client-go/transport.(*dynamicClientCert).processNextWorkItem(0x4001674de0)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:160 +0x38
k8s.io/client-go/transport.(*dynamicClientCert).runWorker(...)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:155
k8s.io/apimachinery/pkg/util/wait.BackoffUntil.func1({0x40016d6a10?, 0x1618bc?})
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:233 +0x24
k8s.io/apimachinery/pkg/util/wait.BackoffUntilWithContext.func1({0x36e5b10?, 0x4000106380?}, 0x40014c96a8?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:255 +0x58
k8s.io/apimachinery/pkg/util/wait.BackoffUntilWithContext({0x36e5b10, 0x4000106380}, 0x40020bff38, {0x369d680, 0x400169fa40}, 0x1)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:256 +0xac
k8s.io/apimachinery/pkg/util/wait.BackoffUntil(0x40014c97a8?, {0x369d680?, 0x400169fa40?}, 0x90?, 0x4000278480?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:233 +0x4c
k8s.io/apimachinery/pkg/util/wait.JitterUntil(0x40017fbf10, 0x3b9aca00, 0x0, 0x1, 0x4000106380)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:210 +0x7c
k8s.io/apimachinery/pkg/util/wait.Until(...)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:163
created by k8s.io/client-go/transport.(*dynamicClientCert).run in goroutine 4253
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:144 +0x174

goroutine 166 [select]:
k8s.io/client-go/util/workqueue.(*delayingType[...]).waitingLoop(0x36fe880, {{0x36f3430, 0x4000224080?}, 0x4000692300?})
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/delaying_queue.go:320 +0x288
created by k8s.io/client-go/util/workqueue.newDelayingQueue[...] in goroutine 149
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/delaying_queue.go:157 +0x204

goroutine 673 [IO wait, 113 minutes]:
internal/poll.runtime_pollWait(0xffff5f6c1200, 0x72)
	/usr/local/go/src/runtime/netpoll.go:351 +0xa0
internal/poll.(*pollDesc).wait(0x40000ee480?, 0x2d970?, 0x0)
	/usr/local/go/src/internal/poll/fd_poll_runtime.go:84 +0x28
internal/poll.(*pollDesc).waitRead(...)
	/usr/local/go/src/internal/poll/fd_poll_runtime.go:89
internal/poll.(*FD).Accept(0x40000ee480)
	/usr/local/go/src/internal/poll/fd_unix.go:613 +0x21c
net.(*netFD).accept(0x40000ee480)
	/usr/local/go/src/net/fd_unix.go:161 +0x28
net.(*TCPListener).accept(0x40017e2f40)
	/usr/local/go/src/net/tcpsock_posix.go:159 +0x24
net.(*TCPListener).Accept(0x40017e2f40)
	/usr/local/go/src/net/tcpsock.go:380 +0x2c
net/http.(*Server).Serve(0x40000faf00, {0x36d3120, 0x40017e2f40})
	/usr/local/go/src/net/http/server.go:3463 +0x24c
net/http.(*Server).ListenAndServe(0x40000faf00)
	/usr/local/go/src/net/http/server.go:3389 +0x80
k8s.io/minikube/test/integration.startHTTPProxy.func1(...)
	/home/jenkins/workspace/Build_Cross/test/integration/functional_test.go:2218
created by k8s.io/minikube/test/integration.startHTTPProxy in goroutine 655
	/home/jenkins/workspace/Build_Cross/test/integration/functional_test.go:2217 +0x104

goroutine 152 [select, 2 minutes]:
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1.1()
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:297 +0x13c
created by k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1 in goroutine 151
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:280 +0xb8

goroutine 5137 [sync.Cond.Wait, 2 minutes]:
sync.runtime_notifyListWait(0x400042b6d0, 0xe)
	/usr/local/go/src/runtime/sema.go:606 +0x140
sync.(*Cond).Wait(0x400042b6c0)
	/usr/local/go/src/sync/cond.go:71 +0xa4
k8s.io/client-go/util/workqueue.(*Typed[...]).Get(0x3701d80)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/queue.go:277 +0x80
k8s.io/client-go/transport.(*dynamicClientCert).processNextWorkItem(0x40013e48a0)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:160 +0x38
k8s.io/client-go/transport.(*dynamicClientCert).runWorker(...)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:155
k8s.io/apimachinery/pkg/util/wait.BackoffUntil.func1({0x40003553b0?, 0x0?})
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:233 +0x24
k8s.io/apimachinery/pkg/util/wait.BackoffUntilWithContext.func1({0x36e5b10?, 0x4000106380?}, 0x0?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:255 +0x58
k8s.io/apimachinery/pkg/util/wait.BackoffUntilWithContext({0x36e5b10, 0x4000106380}, 0x40006a5f38, {0x369d680, 0x40017734a0}, 0x1)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:256 +0xac
k8s.io/apimachinery/pkg/util/wait.BackoffUntil(0x36f3430?, {0x369d680?, 0x40017734a0?}, 0x60?, 0x0?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:233 +0x4c
k8s.io/apimachinery/pkg/util/wait.JitterUntil(0x4001931160, 0x3b9aca00, 0x0, 0x1, 0x4000106380)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:210 +0x7c
k8s.io/apimachinery/pkg/util/wait.Until(...)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:163
created by k8s.io/client-go/transport.(*dynamicClientCert).run in goroutine 5135
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:144 +0x174

goroutine 3898 [select, 5 minutes]:
k8s.io/apimachinery/pkg/util/wait.waitForWithContext({0x36e5b10, 0x4000106380}, 0x400146af40, 0x40017c9f88)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/wait.go:210 +0xac
k8s.io/apimachinery/pkg/util/wait.poll({0x36e5b10, 0x4000106380}, 0x77?, 0x400146af40, 0x400146af88)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:260 +0x8c
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext({0x36e5b10?, 0x4000106380?}, 0x0?, 0x400146af50?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:200 +0x40
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntil(0x36f3430?, 0x4000224080?, 0x4001c00048?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:187 +0x3c
created by k8s.io/client-go/transport.(*dynamicClientCert).run in goroutine 3907
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:146 +0x20c

goroutine 3771 [sync.Cond.Wait, 3 minutes]:
sync.runtime_notifyListWait(0x400042a6d0, 0x17)
	/usr/local/go/src/runtime/sema.go:606 +0x140
sync.(*Cond).Wait(0x400042a6c0)
	/usr/local/go/src/sync/cond.go:71 +0xa4
k8s.io/client-go/util/workqueue.(*Typed[...]).Get(0x3701d80)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/queue.go:277 +0x80
k8s.io/client-go/transport.(*dynamicClientCert).processNextWorkItem(0x40019a2c60)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:160 +0x38
k8s.io/client-go/transport.(*dynamicClientCert).runWorker(...)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:155
k8s.io/apimachinery/pkg/util/wait.BackoffUntil.func1({0x4000179030?, 0x0?})
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:233 +0x24
k8s.io/apimachinery/pkg/util/wait.BackoffUntilWithContext.func1({0x36e5b10?, 0x4000106380?}, 0x0?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:255 +0x58
k8s.io/apimachinery/pkg/util/wait.BackoffUntilWithContext({0x36e5b10, 0x4000106380}, 0x40014eff38, {0x369d680, 0x40014142d0}, 0x1)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:256 +0xac
k8s.io/apimachinery/pkg/util/wait.BackoffUntil(0x36f3430?, {0x369d680?, 0x40014142d0?}, 0x50?, 0x0?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:233 +0x4c
k8s.io/apimachinery/pkg/util/wait.JitterUntil(0x4001930020, 0x3b9aca00, 0x0, 0x1, 0x4000106380)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:210 +0x7c
k8s.io/apimachinery/pkg/util/wait.Until(...)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:163
created by k8s.io/client-go/transport.(*dynamicClientCert).run in goroutine 3768
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:144 +0x174

goroutine 4500 [select]:
k8s.io/client-go/util/workqueue.(*delayingType[...]).waitingLoop(0x36fe880, {{0x36f3430, 0x4000224080?}, 0x4000279980?})
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/delaying_queue.go:320 +0x288
created by k8s.io/client-go/util/workqueue.newDelayingQueue[...] in goroutine 4496
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/delaying_queue.go:157 +0x204

goroutine 3897 [sync.Cond.Wait, 5 minutes]:
sync.runtime_notifyListWait(0x4001b4a690, 0x16)
	/usr/local/go/src/runtime/sema.go:606 +0x140
sync.(*Cond).Wait(0x4001b4a680)
	/usr/local/go/src/sync/cond.go:71 +0xa4
k8s.io/client-go/util/workqueue.(*Typed[...]).Get(0x3701d80)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/queue.go:277 +0x80
k8s.io/client-go/transport.(*dynamicClientCert).processNextWorkItem(0x40013e4d80)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:160 +0x38
k8s.io/client-go/transport.(*dynamicClientCert).runWorker(...)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:155
k8s.io/apimachinery/pkg/util/wait.BackoffUntil.func1({0x4001a04380?, 0x1618bc?})
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:233 +0x24
k8s.io/apimachinery/pkg/util/wait.BackoffUntilWithContext.func1({0x36e5b10?, 0x4000106380?}, 0x400145b6a8?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:255 +0x58
k8s.io/apimachinery/pkg/util/wait.BackoffUntilWithContext({0x36e5b10, 0x4000106380}, 0x40013bbf38, {0x369d680, 0x4000642b10}, 0x1)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:256 +0xac
k8s.io/apimachinery/pkg/util/wait.BackoffUntil(0x400145b7a8?, {0x369d680?, 0x4000642b10?}, 0xd0?, 0x161f90?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:233 +0x4c
k8s.io/apimachinery/pkg/util/wait.JitterUntil(0x40009d83c0, 0x3b9aca00, 0x0, 0x1, 0x4000106380)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:210 +0x7c
k8s.io/apimachinery/pkg/util/wait.Until(...)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:163
created by k8s.io/client-go/transport.(*dynamicClientCert).run in goroutine 3907
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:144 +0x174

goroutine 150 [sync.Cond.Wait, 2 minutes]:
sync.runtime_notifyListWait(0x400042a350, 0x2d)
	/usr/local/go/src/runtime/sema.go:606 +0x140
sync.(*Cond).Wait(0x400042a340)
	/usr/local/go/src/sync/cond.go:71 +0xa4
k8s.io/client-go/util/workqueue.(*Typed[...]).Get(0x3701d80)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/queue.go:277 +0x80
k8s.io/client-go/transport.(*dynamicClientCert).processNextWorkItem(0x40006c3aa0)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:160 +0x38
k8s.io/client-go/transport.(*dynamicClientCert).runWorker(...)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:155
k8s.io/apimachinery/pkg/util/wait.BackoffUntil.func1({0x40002a6310?, 0x0?})
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:233 +0x24
k8s.io/apimachinery/pkg/util/wait.BackoffUntilWithContext.func1({0x36e5b10?, 0x4000106380?}, 0x0?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:255 +0x58
k8s.io/apimachinery/pkg/util/wait.BackoffUntilWithContext({0x36e5b10, 0x4000106380}, 0x400010cf38, {0x369d680, 0x40009d1800}, 0x1)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:256 +0xac
k8s.io/apimachinery/pkg/util/wait.BackoffUntil(0x36f3430?, {0x369d680?, 0x40009d1800?}, 0x30?, 0x0?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:233 +0x4c
k8s.io/apimachinery/pkg/util/wait.JitterUntil(0x40009d8ff0, 0x3b9aca00, 0x0, 0x1, 0x4000106380)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:210 +0x7c
k8s.io/apimachinery/pkg/util/wait.Until(...)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:163
created by k8s.io/client-go/transport.(*dynamicClientCert).run in goroutine 167
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:144 +0x174

goroutine 151 [select, 2 minutes]:
k8s.io/apimachinery/pkg/util/wait.waitForWithContext({0x36e5b10, 0x4000106380}, 0x400146df40, 0x40006a3f88)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/wait.go:210 +0xac
k8s.io/apimachinery/pkg/util/wait.poll({0x36e5b10, 0x4000106380}, 0x7d?, 0x400146df40, 0x400146df88)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:260 +0x8c
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext({0x36e5b10?, 0x4000106380?}, 0x161f90?, 0x40004e9a40?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:200 +0x40
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntil(0x0?, 0x95c64?, 0x40003d8a80?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:187 +0x3c
created by k8s.io/client-go/transport.(*dynamicClientCert).run in goroutine 167
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:146 +0x20c

goroutine 4030 [chan receive, 2 minutes]:
testing.(*T).Run(0x40015ffa40, {0x2993fff?, 0x40000006ee?}, 0x40000ee000)
	/usr/local/go/src/testing/testing.go:2005 +0x378
k8s.io/minikube/test/integration.TestStartStop.func1.1.1(0x40015ffa40)
	/home/jenkins/workspace/Build_Cross/test/integration/start_stop_delete_test.go:153 +0x1b8
testing.tRunner(0x40015ffa40, 0x40017fca80)
	/usr/local/go/src/testing/testing.go:1934 +0xc8
created by testing.(*T).Run in goroutine 3460
	/usr/local/go/src/testing/testing.go:1997 +0x364

goroutine 3191 [chan receive, 35 minutes]:
testing.(*T).Run(0x40014501c0, {0x296d53a?, 0x5393ed8c183?}, 0x40019602a0)
	/usr/local/go/src/testing/testing.go:2005 +0x378
k8s.io/minikube/test/integration.TestNetworkPlugins(0x40014501c0)
	/home/jenkins/workspace/Build_Cross/test/integration/net_test.go:52 +0xe4
testing.tRunner(0x40014501c0, 0x339b500)
	/usr/local/go/src/testing/testing.go:1934 +0xc8
created by testing.(*T).Run in goroutine 1
	/usr/local/go/src/testing/testing.go:1997 +0x364

goroutine 3460 [chan receive, 29 minutes]:
testing.(*T).Run(0x40014f2e00, {0x296e9ac?, 0x0?}, 0x40017fca80)
	/usr/local/go/src/testing/testing.go:2005 +0x378
k8s.io/minikube/test/integration.TestStartStop.func1.1(0x40014f2e00)
	/home/jenkins/workspace/Build_Cross/test/integration/start_stop_delete_test.go:128 +0x7e4
testing.tRunner(0x40014f2e00, 0x40006f8180)
	/usr/local/go/src/testing/testing.go:1934 +0xc8
created by testing.(*T).Run in goroutine 3440
	/usr/local/go/src/testing/testing.go:1997 +0x364

goroutine 4273 [select, 2 minutes]:
k8s.io/apimachinery/pkg/util/wait.waitForWithContext({0x36e5b10, 0x4000106380}, 0x40018c9740, 0x40018c9788)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/wait.go:210 +0xac
k8s.io/apimachinery/pkg/util/wait.poll({0x36e5b10, 0x4000106380}, 0x0?, 0x40018c9740, 0x40018c9788)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:260 +0x8c
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext({0x36e5b10?, 0x4000106380?}, 0x4000178930?, 0x40019a4640?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:200 +0x40
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntil(0x0?, 0x95c64?, 0x40014d9380?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:187 +0x3c
created by k8s.io/client-go/transport.(*dynamicClientCert).run in goroutine 4253
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:146 +0x20c

goroutine 857 [sync.Cond.Wait, 5 minutes]:
sync.runtime_notifyListWait(0x40017e3c90, 0x2b)
	/usr/local/go/src/runtime/sema.go:606 +0x140
sync.(*Cond).Wait(0x40017e3c80)
	/usr/local/go/src/sync/cond.go:71 +0xa4
k8s.io/client-go/util/workqueue.(*Typed[...]).Get(0x3701d80)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/queue.go:277 +0x80
k8s.io/client-go/transport.(*dynamicClientCert).processNextWorkItem(0x40015547e0)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:160 +0x38
k8s.io/client-go/transport.(*dynamicClientCert).runWorker(...)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:155
k8s.io/apimachinery/pkg/util/wait.BackoffUntil.func1({0x40005c4990?, 0x1618bc?})
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:233 +0x24
k8s.io/apimachinery/pkg/util/wait.BackoffUntilWithContext.func1({0x36e5b10?, 0x4000106380?}, 0x40014596a8?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:255 +0x58
k8s.io/apimachinery/pkg/util/wait.BackoffUntilWithContext({0x36e5b10, 0x4000106380}, 0x40015a6f38, {0x369d680, 0x400063e450}, 0x1)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:256 +0xac
k8s.io/apimachinery/pkg/util/wait.BackoffUntil(0x40014597a8?, {0x369d680?, 0x400063e450?}, 0x50?, 0x4001460900?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:233 +0x4c
k8s.io/apimachinery/pkg/util/wait.JitterUntil(0x400008a910, 0x3b9aca00, 0x0, 0x1, 0x4000106380)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:210 +0x7c
k8s.io/apimachinery/pkg/util/wait.Until(...)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:163
created by k8s.io/client-go/transport.(*dynamicClientCert).run in goroutine 845
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:144 +0x174

goroutine 4476 [sync.Cond.Wait, 2 minutes]:
sync.runtime_notifyListWait(0x40019b1cd0, 0x10)
	/usr/local/go/src/runtime/sema.go:606 +0x140
sync.(*Cond).Wait(0x40019b1cc0)
	/usr/local/go/src/sync/cond.go:71 +0xa4
k8s.io/client-go/util/workqueue.(*Typed[...]).Get(0x3701d80)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/queue.go:277 +0x80
k8s.io/client-go/transport.(*dynamicClientCert).processNextWorkItem(0x40019a3da0)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:160 +0x38
k8s.io/client-go/transport.(*dynamicClientCert).runWorker(...)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:155
k8s.io/apimachinery/pkg/util/wait.BackoffUntil.func1({0x4001a9eb60?, 0x40004dfdd0?})
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:233 +0x24
k8s.io/apimachinery/pkg/util/wait.BackoffUntilWithContext.func1({0x36e5b10?, 0x4000106380?}, 0x40004dfe40?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:255 +0x58
k8s.io/apimachinery/pkg/util/wait.BackoffUntilWithContext({0x36e5b10, 0x4000106380}, 0x40006a7f38, {0x369d680, 0x4001927c50}, 0x1)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:256 +0xac
k8s.io/apimachinery/pkg/util/wait.BackoffUntil(0x40004dfdc0?, {0x369d680?, 0x4001927c50?}, 0x30?, 0x36e5778?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:233 +0x4c
k8s.io/apimachinery/pkg/util/wait.JitterUntil(0x4001bdc290, 0x3b9aca00, 0x0, 0x1, 0x4000106380)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:210 +0x7c
k8s.io/apimachinery/pkg/util/wait.Until(...)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:163
created by k8s.io/client-go/transport.(*dynamicClientCert).run in goroutine 4501
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:144 +0x174

goroutine 3772 [select, 3 minutes]:
k8s.io/apimachinery/pkg/util/wait.waitForWithContext({0x36e5b10, 0x4000106380}, 0x40014c7740, 0x40017caf88)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/wait.go:210 +0xac
k8s.io/apimachinery/pkg/util/wait.poll({0x36e5b10, 0x4000106380}, 0x90?, 0x40014c7740, 0x40014c7788)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:260 +0x8c
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext({0x36e5b10?, 0x4000106380?}, 0x0?, 0x0?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:200 +0x40
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntil(0x0?, 0x95c64?, 0x400165ed80?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:187 +0x3c
created by k8s.io/client-go/transport.(*dynamicClientCert).run in goroutine 3768
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:146 +0x20c

goroutine 1534 [sync.Cond.Wait, 2 minutes]:
sync.runtime_notifyListWait(0x4001a744d0, 0x24)
	/usr/local/go/src/runtime/sema.go:606 +0x140
sync.(*Cond).Wait(0x4001a744c0)
	/usr/local/go/src/sync/cond.go:71 +0xa4
k8s.io/client-go/util/workqueue.(*Typed[...]).Get(0x3701d80)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/queue.go:277 +0x80
k8s.io/client-go/transport.(*dynamicClientCert).processNextWorkItem(0x4001675b60)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:160 +0x38
k8s.io/client-go/transport.(*dynamicClientCert).runWorker(...)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:155
k8s.io/apimachinery/pkg/util/wait.BackoffUntil.func1({0x40004c9570?, 0x0?})
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:233 +0x24
k8s.io/apimachinery/pkg/util/wait.BackoffUntilWithContext.func1({0x36e5b10?, 0x4000106380?}, 0x0?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:255 +0x58
k8s.io/apimachinery/pkg/util/wait.BackoffUntilWithContext({0x36e5b10, 0x4000106380}, 0x40015a2f38, {0x369d680, 0x4001be5890}, 0x1)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:256 +0xac
k8s.io/apimachinery/pkg/util/wait.BackoffUntil(0x36f3430?, {0x369d680?, 0x4001be5890?}, 0x70?, 0x0?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:233 +0x4c
k8s.io/apimachinery/pkg/util/wait.JitterUntil(0x4001931260, 0x3b9aca00, 0x0, 0x1, 0x4000106380)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:210 +0x7c
k8s.io/apimachinery/pkg/util/wait.Until(...)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:163
created by k8s.io/client-go/transport.(*dynamicClientCert).run in goroutine 1540
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:144 +0x174

goroutine 844 [select]:
k8s.io/client-go/util/workqueue.(*delayingType[...]).waitingLoop(0x36fe880, {{0x36f3430, 0x4000224080?}, 0x4000278480?})
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/delaying_queue.go:320 +0x288
created by k8s.io/client-go/util/workqueue.newDelayingQueue[...] in goroutine 843
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/delaying_queue.go:157 +0x204

goroutine 1540 [chan receive, 81 minutes]:
k8s.io/client-go/transport.(*dynamicClientCert).run(0x4001675b60, 0x4000106380)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:151 +0x218
created by k8s.io/client-go/transport.(*tlsTransportCache).get in goroutine 1538
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cache.go:126 +0x4d0

goroutine 1911 [chan send, 80 minutes]:
os/exec.(*Cmd).watchCtx(0x4000279800, 0x40016d7b90)
	/usr/local/go/src/os/exec/exec.go:814 +0x280
created by os/exec.(*Cmd).Start in goroutine 1910
	/usr/local/go/src/os/exec/exec.go:775 +0x678

goroutine 5734 [chan receive, 5 minutes]:
k8s.io/client-go/transport.(*dynamicClientCert).run(0x4001a6c420, 0x4000106380)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:151 +0x218
created by k8s.io/client-go/transport.(*tlsTransportCache).get in goroutine 5729
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cache.go:126 +0x4d0

goroutine 5478 [select, 5 minutes]:
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1.1()
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:297 +0x13c
created by k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1 in goroutine 5477
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:280 +0xb8

goroutine 4501 [chan receive, 11 minutes]:
k8s.io/client-go/transport.(*dynamicClientCert).run(0x40019a3da0, 0x4000106380)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:151 +0x218
created by k8s.io/client-go/transport.(*tlsTransportCache).get in goroutine 4496
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cache.go:126 +0x4d0

goroutine 5738 [select, 5 minutes]:
k8s.io/apimachinery/pkg/util/wait.waitForWithContext({0x36e5b10, 0x4000106380}, 0x4001457740, 0x4001457788)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/wait.go:210 +0xac
k8s.io/apimachinery/pkg/util/wait.poll({0x36e5b10, 0x4000106380}, 0xb?, 0x4001457740, 0x4001457788)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:260 +0x8c
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext({0x36e5b10?, 0x4000106380?}, 0x0?, 0x4001457750?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:200 +0x40
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntil(0x36f3430?, 0x4000224080?, 0x4001a7d380?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:187 +0x3c
created by k8s.io/client-go/transport.(*dynamicClientCert).run in goroutine 5734
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:146 +0x20c

goroutine 6325 [IO wait]:
internal/poll.runtime_pollWait(0xffff5f6c1c00, 0x72)
	/usr/local/go/src/runtime/netpoll.go:351 +0xa0
internal/poll.(*pollDesc).wait(0x400154f600?, 0x40016c1800?, 0x0)
	/usr/local/go/src/internal/poll/fd_poll_runtime.go:84 +0x28
internal/poll.(*pollDesc).waitRead(...)
	/usr/local/go/src/internal/poll/fd_poll_runtime.go:89
internal/poll.(*FD).Read(0x400154f600, {0x40016c1800, 0x1800, 0x1800})
	/usr/local/go/src/internal/poll/fd_unix.go:165 +0x1e0
net.(*netFD).Read(0x400154f600, {0x40016c1800?, 0x40016c185a?, 0x5?})
	/usr/local/go/src/net/fd_posix.go:68 +0x28
net.(*conn).Read(0x40003af708, {0x40016c1800?, 0x40014f0888?, 0x8b27c?})
	/usr/local/go/src/net/net.go:196 +0x34
crypto/tls.(*atLeastReader).Read(0x4001f922b8, {0x40016c1800?, 0x40014f08e8?, 0x2cb794?})
	/usr/local/go/src/crypto/tls/conn.go:816 +0x38
bytes.(*Buffer).ReadFrom(0x40018a5428, {0x369dda0, 0x4001f922b8})
	/usr/local/go/src/bytes/buffer.go:217 +0x90
crypto/tls.(*Conn).readFromUntil(0x40018a5188, {0xffff5f485240, 0x4001a67950}, 0x40014f0990?)
	/usr/local/go/src/crypto/tls/conn.go:838 +0xcc
crypto/tls.(*Conn).readRecordOrCCS(0x40018a5188, 0x0)
	/usr/local/go/src/crypto/tls/conn.go:627 +0x340
crypto/tls.(*Conn).readRecord(...)
	/usr/local/go/src/crypto/tls/conn.go:589
crypto/tls.(*Conn).Read(0x40018a5188, {0x4001f34000, 0x1000, 0x542e2c?})
	/usr/local/go/src/crypto/tls/conn.go:1392 +0x14c
bufio.(*Reader).Read(0x4001f236e0, {0x40001b39a0, 0x9, 0x542e44?})
	/usr/local/go/src/bufio/bufio.go:245 +0x188
io.ReadAtLeast({0x369bce0, 0x4001f236e0}, {0x40001b39a0, 0x9, 0x9}, 0x9)
	/usr/local/go/src/io/io.go:335 +0x98
io.ReadFull(...)
	/usr/local/go/src/io/io.go:354
golang.org/x/net/http2.readFrameHeader({0x40001b39a0, 0x9, 0x4001faebd0?}, {0x369bce0?, 0x4001f236e0?})
	/home/jenkins/go/pkg/mod/golang.org/x/net@v0.43.0/http2/frame.go:242 +0x58
golang.org/x/net/http2.(*Framer).ReadFrame(0x40001b3960)
	/home/jenkins/go/pkg/mod/golang.org/x/net@v0.43.0/http2/frame.go:506 +0x70
golang.org/x/net/http2.(*clientConnReadLoop).run(0x40014f0f98)
	/home/jenkins/go/pkg/mod/golang.org/x/net@v0.43.0/http2/transport.go:2258 +0xcc
golang.org/x/net/http2.(*ClientConn).readLoop(0x40015501c0)
	/home/jenkins/go/pkg/mod/golang.org/x/net@v0.43.0/http2/transport.go:2127 +0x6c
created by golang.org/x/net/http2.(*Transport).newClientConn in goroutine 6324
	/home/jenkins/go/pkg/mod/golang.org/x/net@v0.43.0/http2/transport.go:912 +0xae0

goroutine 858 [select, 5 minutes]:
k8s.io/apimachinery/pkg/util/wait.waitForWithContext({0x36e5b10, 0x4000106380}, 0x4001457f40, 0x40006a2f88)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/wait.go:210 +0xac
k8s.io/apimachinery/pkg/util/wait.poll({0x36e5b10, 0x4000106380}, 0x51?, 0x4001457f40, 0x4001457f88)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:260 +0x8c
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext({0x36e5b10?, 0x4000106380?}, 0x745372656e696174?, 0x3a22736573757461?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:200 +0x40
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntil(0x0?, 0x95c64?, 0x4001580900?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:187 +0x3c
created by k8s.io/client-go/transport.(*dynamicClientCert).run in goroutine 845
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:146 +0x20c

goroutine 4813 [select]:
k8s.io/client-go/util/workqueue.(*delayingType[...]).waitingLoop(0x36fe880, {{0x36f3430, 0x4000224080?}, 0x40015fec40?})
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/delaying_queue.go:320 +0x288
created by k8s.io/client-go/util/workqueue.newDelayingQueue[...] in goroutine 4812
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/delaying_queue.go:157 +0x204

goroutine 6318 [chan receive, 2 minutes]:
k8s.io/client-go/transport.(*dynamicClientCert).run(0x4001c1dc20, 0x4000106380)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:151 +0x218
created by k8s.io/client-go/transport.(*tlsTransportCache).get in goroutine 6311
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cache.go:126 +0x4d0

goroutine 1975 [chan send, 79 minutes]:
os/exec.(*Cmd).watchCtx(0x4000692600, 0x4001c4f490)
	/usr/local/go/src/os/exec/exec.go:814 +0x280
created by os/exec.(*Cmd).Start in goroutine 1449
	/usr/local/go/src/os/exec/exec.go:775 +0x678

goroutine 5933 [select]:
k8s.io/apimachinery/pkg/util/wait.loopConditionUntilContext({0x36e5708, 0x4000406410}, {0x36d3780, 0x400191fd80}, 0x1, 0x0, 0x4001539b00)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/loop.go:66 +0x158
k8s.io/apimachinery/pkg/util/wait.PollUntilContextTimeout({0x36e5778?, 0x4000481d50?}, 0x3b9aca00, 0x4001539d28?, 0x1, 0x4001539b00)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:48 +0x8c
k8s.io/minikube/test/integration.PodWait({0x36e5778, 0x4000481d50}, 0x40015fec40, {0x4001a09998, 0x11}, {0x2993faf, 0x14}, {0x29abe76, 0x1c}, 0x7dba821800)
	/home/jenkins/workspace/Build_Cross/test/integration/helpers_test.go:379 +0x22c
k8s.io/minikube/test/integration.validateAddonAfterStop({0x36e5778, 0x4000481d50}, 0x40015fec40, {0x4001a09998, 0x11}, {0x297850c?, 0x2c570a7200161e84?}, {0x692f4fe6?, 0x400010ef58?}, {0x161f08?, ...})
	/home/jenkins/workspace/Build_Cross/test/integration/start_stop_delete_test.go:285 +0xd4
k8s.io/minikube/test/integration.TestStartStop.func1.1.1.1(0x40015fec40?)
	/home/jenkins/workspace/Build_Cross/test/integration/start_stop_delete_test.go:154 +0x44
testing.tRunner(0x40015fec40, 0x40000ee000)
	/usr/local/go/src/testing/testing.go:1934 +0xc8
created by testing.(*T).Run in goroutine 4030
	/usr/local/go/src/testing/testing.go:1997 +0x364

goroutine 1877 [chan send, 80 minutes]:
os/exec.(*Cmd).watchCtx(0x4000278900, 0x40016d6f50)
	/usr/local/go/src/os/exec/exec.go:814 +0x280
created by os/exec.(*Cmd).Start in goroutine 1876
	/usr/local/go/src/os/exec/exec.go:775 +0x678

goroutine 5472 [select]:
k8s.io/client-go/util/workqueue.(*delayingType[...]).waitingLoop(0x36fe880, {{0x36f3430, 0x4000224080?}, 0x4000692d80?})
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/delaying_queue.go:320 +0x288
created by k8s.io/client-go/util/workqueue.newDelayingQueue[...] in goroutine 5468
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/delaying_queue.go:157 +0x204

goroutine 4834 [select, 3 minutes]:
k8s.io/apimachinery/pkg/util/wait.waitForWithContext({0x36e5b10, 0x4000106380}, 0x4001454740, 0x4001454788)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/wait.go:210 +0xac
k8s.io/apimachinery/pkg/util/wait.poll({0x36e5b10, 0x4000106380}, 0x0?, 0x4001454740, 0x4001454788)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:260 +0x8c
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext({0x36e5b10?, 0x4000106380?}, 0x36e5778?, 0x4001a04a10?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:200 +0x40
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntil(0x4001a04930?, 0x0?, 0x40014d9200?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:187 +0x3c
created by k8s.io/client-go/transport.(*dynamicClientCert).run in goroutine 4814
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:146 +0x20c

goroutine 1099 [chan send, 109 minutes]:
os/exec.(*Cmd).watchCtx(0x4000692a80, 0x4001c4f960)
	/usr/local/go/src/os/exec/exec.go:814 +0x280
created by os/exec.(*Cmd).Start in goroutine 1098
	/usr/local/go/src/os/exec/exec.go:775 +0x678

goroutine 6454 [IO wait]:
internal/poll.runtime_pollWait(0xffff5f3ca600, 0x72)
	/usr/local/go/src/runtime/netpoll.go:351 +0xa0
internal/poll.(*pollDesc).wait(0x4001878ea0?, 0x4001604c52?, 0x1)
	/usr/local/go/src/internal/poll/fd_poll_runtime.go:84 +0x28
internal/poll.(*pollDesc).waitRead(...)
	/usr/local/go/src/internal/poll/fd_poll_runtime.go:89
internal/poll.(*FD).Read(0x4001878ea0, {0x4001604c52, 0x3ae, 0x3ae})
	/usr/local/go/src/internal/poll/fd_unix.go:165 +0x1e0
os.(*File).read(...)
	/usr/local/go/src/os/file_posix.go:29
os.(*File).Read(0x4000113a68, {0x4001604c52?, 0x400146bd48?, 0xcc7cc?})
	/usr/local/go/src/os/file.go:144 +0x68
bytes.(*Buffer).ReadFrom(0x40019fc600, {0x369ba58, 0x40003aee08})
	/usr/local/go/src/bytes/buffer.go:217 +0x90
io.copyBuffer({0x369bc40, 0x40019fc600}, {0x369ba58, 0x40003aee08}, {0x0, 0x0, 0x0})
	/usr/local/go/src/io/io.go:415 +0x14c
io.Copy(...)
	/usr/local/go/src/io/io.go:388
os.genericWriteTo(0x4000113a68?, {0x369bc40, 0x40019fc600})
	/usr/local/go/src/os/file.go:295 +0x58
os.(*File).WriteTo(0x4000113a68, {0x369bc40, 0x40019fc600})
	/usr/local/go/src/os/file.go:273 +0x9c
io.copyBuffer({0x369bc40, 0x40019fc600}, {0x369bad8, 0x4000113a68}, {0x0, 0x0, 0x0})
	/usr/local/go/src/io/io.go:411 +0x98
io.Copy(...)
	/usr/local/go/src/io/io.go:388
os/exec.(*Cmd).writerDescriptor.func1()
	/usr/local/go/src/os/exec/exec.go:596 +0x40
os/exec.(*Cmd).Start.func2(0x4000693e00?)
	/usr/local/go/src/os/exec/exec.go:749 +0x30
created by os/exec.(*Cmd).Start in goroutine 3545
	/usr/local/go/src/os/exec/exec.go:748 +0x6a4

goroutine 5739 [select, 5 minutes]:
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1.1()
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:297 +0x13c
created by k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1 in goroutine 5738
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:280 +0xb8

goroutine 1026 [chan send, 109 minutes]:
os/exec.(*Cmd).watchCtx(0x4001be6180, 0x4001a04310)
	/usr/local/go/src/os/exec/exec.go:814 +0x280
created by os/exec.(*Cmd).Start in goroutine 1025
	/usr/local/go/src/os/exec/exec.go:775 +0x678

goroutine 5476 [sync.Cond.Wait, 5 minutes]:
sync.runtime_notifyListWait(0x400042a890, 0x0)
	/usr/local/go/src/runtime/sema.go:606 +0x140
sync.(*Cond).Wait(0x400042a880)
	/usr/local/go/src/sync/cond.go:71 +0xa4
k8s.io/client-go/util/workqueue.(*Typed[...]).Get(0x3701d80)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/queue.go:277 +0x80
k8s.io/client-go/transport.(*dynamicClientCert).processNextWorkItem(0x40018795c0)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:160 +0x38
k8s.io/client-go/transport.(*dynamicClientCert).runWorker(...)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:155
k8s.io/apimachinery/pkg/util/wait.BackoffUntil.func1({0x4001c4f0a0?, 0x1618bc?})
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:233 +0x24
k8s.io/apimachinery/pkg/util/wait.BackoffUntilWithContext.func1({0x36e5b10?, 0x4000106380?}, 0x40014c46a8?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:255 +0x58
k8s.io/apimachinery/pkg/util/wait.BackoffUntilWithContext({0x36e5b10, 0x4000106380}, 0x40006a0f38, {0x369d680, 0x400064c6c0}, 0x1)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:256 +0xac
k8s.io/apimachinery/pkg/util/wait.BackoffUntil(0x40014c47a8?, {0x369d680?, 0x400064c6c0?}, 0x0?, 0x0?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:233 +0x4c
k8s.io/apimachinery/pkg/util/wait.JitterUntil(0x40019c8fc0, 0x3b9aca00, 0x0, 0x1, 0x4000106380)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:210 +0x7c
k8s.io/apimachinery/pkg/util/wait.Until(...)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:163
created by k8s.io/client-go/transport.(*dynamicClientCert).run in goroutine 5473
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:144 +0x174

goroutine 1109 [select, 109 minutes]:
net/http.(*persistConn).writeLoop(0x40013e07e0)
	/usr/local/go/src/net/http/transport.go:2600 +0x94
created by net/http.(*Transport).dialConn in goroutine 1106
	/usr/local/go/src/net/http/transport.go:1948 +0x1164

goroutine 6008 [sync.Cond.Wait, 2 minutes]:
sync.runtime_notifyListWait(0x400042b1d0, 0x0)
	/usr/local/go/src/runtime/sema.go:606 +0x140
sync.(*Cond).Wait(0x400042b1c0)
	/usr/local/go/src/sync/cond.go:71 +0xa4
k8s.io/client-go/util/workqueue.(*Typed[...]).Get(0x3701d80)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/queue.go:277 +0x80
k8s.io/client-go/transport.(*dynamicClientCert).processNextWorkItem(0x4001878fc0)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:160 +0x38
k8s.io/client-go/transport.(*dynamicClientCert).runWorker(...)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:155
k8s.io/apimachinery/pkg/util/wait.BackoffUntil.func1({0x4001a9f570?, 0x21dd4?})
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:233 +0x24
k8s.io/apimachinery/pkg/util/wait.BackoffUntilWithContext.func1({0x36e5b10?, 0x4000106380?}, 0x40000a76a8?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:255 +0x58
k8s.io/apimachinery/pkg/util/wait.BackoffUntilWithContext({0x36e5b10, 0x4000106380}, 0x40006a4f38, {0x369d680, 0x4001a9da40}, 0x1)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:256 +0xac
k8s.io/apimachinery/pkg/util/wait.BackoffUntil(0x11?, {0x369d680?, 0x4001a9da40?}, 0x0?, 0x36e5778?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:233 +0x4c
k8s.io/apimachinery/pkg/util/wait.JitterUntil(0x40019c8b50, 0x3b9aca00, 0x0, 0x1, 0x4000106380)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:210 +0x7c
k8s.io/apimachinery/pkg/util/wait.Until(...)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:163
created by k8s.io/client-go/transport.(*dynamicClientCert).run in goroutine 6026
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:144 +0x174

goroutine 1108 [select, 109 minutes]:
net/http.(*persistConn).readLoop(0x40013e07e0)
	/usr/local/go/src/net/http/transport.go:2398 +0xa6c
created by net/http.(*Transport).dialConn in goroutine 1106
	/usr/local/go/src/net/http/transport.go:1947 +0x111c

goroutine 1536 [select, 2 minutes]:
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1.1()
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:297 +0x13c
created by k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1 in goroutine 1535
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:280 +0xb8

goroutine 5138 [select, 2 minutes]:
k8s.io/apimachinery/pkg/util/wait.waitForWithContext({0x36e5b10, 0x4000106380}, 0x400146df40, 0x400146df88)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/wait.go:210 +0xac
k8s.io/apimachinery/pkg/util/wait.poll({0x36e5b10, 0x4000106380}, 0x0?, 0x400146df40, 0x400146df88)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:260 +0x8c
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext({0x36e5b10?, 0x4000106380?}, 0x36e5778?, 0x40016d7030?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:200 +0x40
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntil(0x40016d6ee0?, 0x0?, 0x40015ff340?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:187 +0x3c
created by k8s.io/client-go/transport.(*dynamicClientCert).run in goroutine 5135
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:146 +0x20c

goroutine 859 [select, 5 minutes]:
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1.1()
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:297 +0x13c
created by k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1 in goroutine 858
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:280 +0xb8

goroutine 1066 [chan send, 109 minutes]:
os/exec.(*Cmd).watchCtx(0x4001c47200, 0x4001c4ed20)
	/usr/local/go/src/os/exec/exec.go:814 +0x280
created by os/exec.(*Cmd).Start in goroutine 766
	/usr/local/go/src/os/exec/exec.go:775 +0x678

goroutine 5473 [chan receive, 5 minutes]:
k8s.io/client-go/transport.(*dynamicClientCert).run(0x40018795c0, 0x4000106380)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:151 +0x218
created by k8s.io/client-go/transport.(*tlsTransportCache).get in goroutine 5468
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cache.go:126 +0x4d0

goroutine 4477 [select, 5 minutes]:
k8s.io/apimachinery/pkg/util/wait.waitForWithContext({0x36e5b10, 0x4000106380}, 0x400146b740, 0x400146b788)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/wait.go:210 +0xac
k8s.io/apimachinery/pkg/util/wait.poll({0x36e5b10, 0x4000106380}, 0x30?, 0x400146b740, 0x400146b788)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:260 +0x8c
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext({0x36e5b10?, 0x4000106380?}, 0x40014d9c80?, 0x40002a23c0?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:200 +0x40
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntil(0x0?, 0x95c64?, 0x40014d9800?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:187 +0x3c
created by k8s.io/client-go/transport.(*dynamicClientCert).run in goroutine 4501
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:146 +0x20c

goroutine 4274 [select, 2 minutes]:
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1.1()
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:297 +0x13c
created by k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1 in goroutine 4273
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:280 +0xb8

goroutine 845 [chan receive, 111 minutes]:
k8s.io/client-go/transport.(*dynamicClientCert).run(0x40015547e0, 0x4000106380)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:151 +0x218
created by k8s.io/client-go/transport.(*tlsTransportCache).get in goroutine 843
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cache.go:126 +0x4d0

goroutine 4252 [select]:
k8s.io/client-go/util/workqueue.(*delayingType[...]).waitingLoop(0x36fe880, {{0x36f3430, 0x4000224080?}, 0x4001550540?})
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/delaying_queue.go:320 +0x288
created by k8s.io/client-go/util/workqueue.newDelayingQueue[...] in goroutine 4251
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/delaying_queue.go:157 +0x204

goroutine 5477 [select, 5 minutes]:
k8s.io/apimachinery/pkg/util/wait.waitForWithContext({0x36e5b10, 0x4000106380}, 0x4001458740, 0x4001458788)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/wait.go:210 +0xac
k8s.io/apimachinery/pkg/util/wait.poll({0x36e5b10, 0x4000106380}, 0x10?, 0x4001458740, 0x4001458788)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:260 +0x8c
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext({0x36e5b10?, 0x4000106380?}, 0x40004b0b00?, 0x40004b0b00?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:200 +0x40
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntil(0x0?, 0x95c64?, 0x4000279800?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:187 +0x3c
created by k8s.io/client-go/transport.(*dynamicClientCert).run in goroutine 5473
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:146 +0x20c

goroutine 1303 [IO wait, 109 minutes]:
internal/poll.runtime_pollWait(0xffff5f6c1a00, 0x72)
	/usr/local/go/src/runtime/netpoll.go:351 +0xa0
internal/poll.(*pollDesc).wait(0x40017fce00?, 0xdbd0c?, 0x0)
	/usr/local/go/src/internal/poll/fd_poll_runtime.go:84 +0x28
internal/poll.(*pollDesc).waitRead(...)
	/usr/local/go/src/internal/poll/fd_poll_runtime.go:89
internal/poll.(*FD).Accept(0x40017fce00)
	/usr/local/go/src/internal/poll/fd_unix.go:613 +0x21c
net.(*netFD).accept(0x40017fce00)
	/usr/local/go/src/net/fd_unix.go:161 +0x28
net.(*TCPListener).accept(0x40017e2340)
	/usr/local/go/src/net/tcpsock_posix.go:159 +0x24
net.(*TCPListener).Accept(0x40017e2340)
	/usr/local/go/src/net/tcpsock.go:380 +0x2c
net/http.(*Server).Serve(0x40017aee00, {0x36d3120, 0x40017e2340})
	/usr/local/go/src/net/http/server.go:3463 +0x24c
net/http.(*Server).ListenAndServe(0x40017aee00)
	/usr/local/go/src/net/http/server.go:3389 +0x80
k8s.io/minikube/test/integration.startHTTPProxy.func1(...)
	/home/jenkins/workspace/Build_Cross/test/integration/functional_test.go:2218
created by k8s.io/minikube/test/integration.startHTTPProxy in goroutine 1301
	/home/jenkins/workspace/Build_Cross/test/integration/functional_test.go:2217 +0x104

goroutine 3509 [chan receive, 11 minutes]:
testing.tRunner.func1()
	/usr/local/go/src/testing/testing.go:1891 +0x3d0
testing.tRunner(0x4001c9a540, 0x40019602a0)
	/usr/local/go/src/testing/testing.go:1940 +0x104
created by testing.(*T).Run in goroutine 3191
	/usr/local/go/src/testing/testing.go:1997 +0x364

goroutine 3768 [chan receive, 33 minutes]:
k8s.io/client-go/transport.(*dynamicClientCert).run(0x40019a2c60, 0x4000106380)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:151 +0x218
created by k8s.io/client-go/transport.(*tlsTransportCache).get in goroutine 3763
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cache.go:126 +0x4d0

goroutine 3767 [select]:
k8s.io/client-go/util/workqueue.(*delayingType[...]).waitingLoop(0x36fe880, {{0x36f3430, 0x4000224080?}, 0x4001c9a380?})
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/delaying_queue.go:320 +0x288
created by k8s.io/client-go/util/workqueue.newDelayingQueue[...] in goroutine 3763
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/delaying_queue.go:157 +0x204

goroutine 1535 [select, 2 minutes]:
k8s.io/apimachinery/pkg/util/wait.waitForWithContext({0x36e5b10, 0x4000106380}, 0x40018cbf40, 0x400010af88)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/wait.go:210 +0xac
k8s.io/apimachinery/pkg/util/wait.poll({0x36e5b10, 0x4000106380}, 0x8?, 0x40018cbf40, 0x40018cbf88)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:260 +0x8c
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext({0x36e5b10?, 0x4000106380?}, 0x0?, 0x0?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:200 +0x40
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntil(0x0?, 0x95c64?, 0x4000279980?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:187 +0x3c
created by k8s.io/client-go/transport.(*dynamicClientCert).run in goroutine 1540
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:146 +0x20c

goroutine 5134 [select]:
k8s.io/client-go/util/workqueue.(*delayingType[...]).waitingLoop(0x36fe880, {{0x36f3430, 0x4000224080?}, 0x40015501c0?})
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/delaying_queue.go:320 +0x288
created by k8s.io/client-go/util/workqueue.newDelayingQueue[...] in goroutine 5133
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/delaying_queue.go:157 +0x204

goroutine 6025 [select]:
k8s.io/client-go/util/workqueue.(*delayingType[...]).waitingLoop(0x36fe880, {{0x36f3430, 0x4000224080?}, 0x4001c9a000?})
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/delaying_queue.go:320 +0x288
created by k8s.io/client-go/util/workqueue.newDelayingQueue[...] in goroutine 6024
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/delaying_queue.go:157 +0x204

goroutine 3899 [select, 5 minutes]:
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1.1()
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:297 +0x13c
created by k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1 in goroutine 3898
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:280 +0xb8

goroutine 3545 [syscall]:
syscall.Syscall6(0x5f, 0x3, 0x12, 0x40014e19a8, 0x4, 0x4001807320, 0x0)
	/usr/local/go/src/syscall/syscall_linux.go:96 +0x2c
internal/syscall/unix.Waitid(0x40014e1b08?, 0x1929a0?, 0xffffebc4a14b?, 0x0?, 0x40019b1e40?)
	/usr/local/go/src/internal/syscall/unix/waitid_linux.go:18 +0x44
os.(*Process).pidfdWait.func1(...)
	/usr/local/go/src/os/pidfd_linux.go:109
os.ignoringEINTR(...)
	/usr/local/go/src/os/file_posix.go:256
os.(*Process).pidfdWait(0x40019b1ec0)
	/usr/local/go/src/os/pidfd_linux.go:108 +0x144
os.(*Process).wait(0x40014e1ad8?)
	/usr/local/go/src/os/exec_unix.go:25 +0x24
os.(*Process).Wait(...)
	/usr/local/go/src/os/exec.go:340
os/exec.(*Cmd).Wait(0x4000279680)
	/usr/local/go/src/os/exec/exec.go:922 +0x38
os/exec.(*Cmd).Run(0x4000279680)
	/usr/local/go/src/os/exec/exec.go:626 +0x38
k8s.io/minikube/test/integration.Run(0x4001451c00, 0x4000279680)
	/home/jenkins/workspace/Build_Cross/test/integration/helpers_test.go:103 +0x154
k8s.io/minikube/test/integration.Cleanup(0x4001451c00, {0x40002e4c80, 0xd}, 0x40019c9c70)
	/home/jenkins/workspace/Build_Cross/test/integration/helpers_test.go:178 +0x114
k8s.io/minikube/test/integration.CleanupWithLogs(0x4001451c00, {0x40002e4c80, 0xd}, 0x40019c9c70)
	/home/jenkins/workspace/Build_Cross/test/integration/helpers_test.go:192 +0x120
k8s.io/minikube/test/integration.TestNetworkPlugins.func1.1(0x4001451c00)
	/home/jenkins/workspace/Build_Cross/test/integration/net_test.go:211 +0x990
testing.tRunner(0x4001451c00, 0x40001ac680)
	/usr/local/go/src/testing/testing.go:1934 +0xc8
created by testing.(*T).Run in goroutine 3509
	/usr/local/go/src/testing/testing.go:1997 +0x364

goroutine 6317 [select]:
k8s.io/client-go/util/workqueue.(*delayingType[...]).waitingLoop(0x36fe880, {{0x36f3430, 0x4000224080?}, 0x40019b9818?})
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/delaying_queue.go:320 +0x288
created by k8s.io/client-go/util/workqueue.newDelayingQueue[...] in goroutine 6311
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/delaying_queue.go:157 +0x204

goroutine 4253 [chan receive, 11 minutes]:
k8s.io/client-go/transport.(*dynamicClientCert).run(0x4001674de0, 0x4000106380)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:151 +0x218
created by k8s.io/client-go/transport.(*tlsTransportCache).get in goroutine 4251
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cache.go:126 +0x4d0

goroutine 6026 [chan receive, 2 minutes]:
k8s.io/client-go/transport.(*dynamicClientCert).run(0x4001878fc0, 0x4000106380)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:151 +0x218
created by k8s.io/client-go/transport.(*tlsTransportCache).get in goroutine 6024
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cache.go:126 +0x4d0

goroutine 6456 [select]:
os/exec.(*Cmd).watchCtx(0x4000279680, 0x40015ae0e0)
	/usr/local/go/src/os/exec/exec.go:789 +0x70
created by os/exec.(*Cmd).Start in goroutine 3545
	/usr/local/go/src/os/exec/exec.go:775 +0x678

goroutine 3773 [select, 3 minutes]:
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1.1()
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:297 +0x13c
created by k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1 in goroutine 3772
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:280 +0xb8

goroutine 5139 [select, 2 minutes]:
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1.1()
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:297 +0x13c
created by k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1 in goroutine 5138
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:280 +0xb8

goroutine 6009 [select, 2 minutes]:
k8s.io/apimachinery/pkg/util/wait.waitForWithContext({0x36e5b10, 0x4000106380}, 0x400146d740, 0x400146d788)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/wait.go:210 +0xac
k8s.io/apimachinery/pkg/util/wait.poll({0x36e5b10, 0x4000106380}, 0x72?, 0x400146d740, 0x400146d788)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:260 +0x8c
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext({0x36e5b10?, 0x4000106380?}, 0x0?, 0x400146d750?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:200 +0x40
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntil(0x36f3430?, 0x4000224080?, 0x4001c9a000?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:187 +0x3c
created by k8s.io/client-go/transport.(*dynamicClientCert).run in goroutine 6026
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:146 +0x20c

goroutine 6323 [select, 2 minutes]:
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1.1()
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:297 +0x13c
created by k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1 in goroutine 6322
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:280 +0xb8

goroutine 6010 [select, 2 minutes]:
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1.1()
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:297 +0x13c
created by k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1 in goroutine 6009
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:280 +0xb8

goroutine 4478 [select, 5 minutes]:
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1.1()
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:297 +0x13c
created by k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1 in goroutine 4477
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:280 +0xb8

goroutine 5733 [select]:
k8s.io/client-go/util/workqueue.(*delayingType[...]).waitingLoop(0x36fe880, {{0x36f3430, 0x4000224080?}, 0x4001a7d380?})
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/delaying_queue.go:320 +0x288
created by k8s.io/client-go/util/workqueue.newDelayingQueue[...] in goroutine 5729
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/delaying_queue.go:157 +0x204

goroutine 3906 [select]:
k8s.io/client-go/util/workqueue.(*delayingType[...]).waitingLoop(0x36fe880, {{0x36f3430, 0x4000224080?}, 0x4001c00048?})
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/delaying_queue.go:320 +0x288
created by k8s.io/client-go/util/workqueue.newDelayingQueue[...] in goroutine 3893
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/delaying_queue.go:157 +0x204

goroutine 1539 [select]:
k8s.io/client-go/util/workqueue.(*delayingType[...]).waitingLoop(0x36fe880, {{0x36f3430, 0x4000224080?}, 0x4000692600?})
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/delaying_queue.go:320 +0x288
created by k8s.io/client-go/util/workqueue.newDelayingQueue[...] in goroutine 1538
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/delaying_queue.go:157 +0x204

goroutine 4814 [chan receive, 9 minutes]:
k8s.io/client-go/transport.(*dynamicClientCert).run(0x40019a2e40, 0x4000106380)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:151 +0x218
created by k8s.io/client-go/transport.(*tlsTransportCache).get in goroutine 4812
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cache.go:126 +0x4d0

goroutine 4833 [sync.Cond.Wait]:
sync.runtime_notifyListWait(0x4001a75b50, 0x10)
	/usr/local/go/src/runtime/sema.go:606 +0x140
sync.(*Cond).Wait(0x4001a75b40)
	/usr/local/go/src/sync/cond.go:71 +0xa4
k8s.io/client-go/util/workqueue.(*Typed[...]).Get(0x3701d80)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/queue.go:277 +0x80
k8s.io/client-go/transport.(*dynamicClientCert).processNextWorkItem(0x40019a2e40)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:160 +0x38
k8s.io/client-go/transport.(*dynamicClientCert).runWorker(...)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:155
k8s.io/apimachinery/pkg/util/wait.BackoffUntil.func1({0x4000263490?, 0x0?})
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:233 +0x24
k8s.io/apimachinery/pkg/util/wait.BackoffUntilWithContext.func1({0x36e5b10?, 0x4000106380?}, 0x0?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:255 +0x58
k8s.io/apimachinery/pkg/util/wait.BackoffUntilWithContext({0x36e5b10, 0x4000106380}, 0x40006a1f38, {0x369d680, 0x4001c40f60}, 0x1)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:256 +0xac
k8s.io/apimachinery/pkg/util/wait.BackoffUntil(0x36f3430?, {0x369d680?, 0x4001c40f60?}, 0x60?, 0x0?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:233 +0x4c
k8s.io/apimachinery/pkg/util/wait.JitterUntil(0x40013faea0, 0x3b9aca00, 0x0, 0x1, 0x4000106380)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:210 +0x7c
k8s.io/apimachinery/pkg/util/wait.Until(...)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:163
created by k8s.io/client-go/transport.(*dynamicClientCert).run in goroutine 4814
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:144 +0x174

goroutine 5737 [sync.Cond.Wait, 5 minutes]:
sync.runtime_notifyListWait(0x40008f2290, 0x0)
	/usr/local/go/src/runtime/sema.go:606 +0x140
sync.(*Cond).Wait(0x40008f2280)
	/usr/local/go/src/sync/cond.go:71 +0xa4
k8s.io/client-go/util/workqueue.(*Typed[...]).Get(0x3701d80)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/util/workqueue/queue.go:277 +0x80
k8s.io/client-go/transport.(*dynamicClientCert).processNextWorkItem(0x4001a6c420)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:160 +0x38
k8s.io/client-go/transport.(*dynamicClientCert).runWorker(...)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:155
k8s.io/apimachinery/pkg/util/wait.BackoffUntil.func1({0x400152f2d0?, 0x1618bc?})
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:233 +0x24
k8s.io/apimachinery/pkg/util/wait.BackoffUntilWithContext.func1({0x36e5b10?, 0x4000106380?}, 0x400146f6a8?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:255 +0x58
k8s.io/apimachinery/pkg/util/wait.BackoffUntilWithContext({0x36e5b10, 0x4000106380}, 0x40015a4f38, {0x369d680, 0x4000789d70}, 0x1)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:256 +0xac
k8s.io/apimachinery/pkg/util/wait.BackoffUntil(0x400146f7a8?, {0x369d680?, 0x4000789d70?}, 0x50?, 0x0?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:233 +0x4c
k8s.io/apimachinery/pkg/util/wait.JitterUntil(0x4001a88480, 0x3b9aca00, 0x0, 0x1, 0x4000106380)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:210 +0x7c
k8s.io/apimachinery/pkg/util/wait.Until(...)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/backoff.go:163
created by k8s.io/client-go/transport.(*dynamicClientCert).run in goroutine 5734
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:144 +0x174

goroutine 3907 [chan receive, 31 minutes]:
k8s.io/client-go/transport.(*dynamicClientCert).run(0x40013e4d80, 0x4000106380)
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:151 +0x218
created by k8s.io/client-go/transport.(*tlsTransportCache).get in goroutine 3893
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cache.go:126 +0x4d0

goroutine 3262 [chan receive, 37 minutes]:
testing.(*T).Run(0x4001451180, {0x296d53a?, 0x40015a7f58?}, 0x339b730)
	/usr/local/go/src/testing/testing.go:2005 +0x378
k8s.io/minikube/test/integration.TestStartStop(0x4001451180)
	/home/jenkins/workspace/Build_Cross/test/integration/start_stop_delete_test.go:46 +0x3c
testing.tRunner(0x4001451180, 0x339b548)
	/usr/local/go/src/testing/testing.go:1934 +0xc8
created by testing.(*T).Run in goroutine 1
	/usr/local/go/src/testing/testing.go:1997 +0x364

goroutine 6322 [select, 2 minutes]:
k8s.io/apimachinery/pkg/util/wait.waitForWithContext({0x36e5b10, 0x4000106380}, 0x4001459f40, 0x4001459f88)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/wait.go:210 +0xac
k8s.io/apimachinery/pkg/util/wait.poll({0x36e5b10, 0x4000106380}, 0x83?, 0x4001459f40, 0x4001459f88)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:260 +0x8c
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext({0x36e5b10?, 0x4000106380?}, 0x0?, 0x4001459f50?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:200 +0x40
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntil(0x36f3430?, 0x4000224080?, 0x40019b9818?)
	/home/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.33.4/pkg/util/wait/poll.go:187 +0x3c
created by k8s.io/client-go/transport.(*dynamicClientCert).run in goroutine 6318
	/home/jenkins/go/pkg/mod/k8s.io/client-go@v0.33.4/transport/cert_rotation.go:146 +0x20c


Test pass (262/321)

Order passed test Duration
3 TestDownloadOnly/v1.28.0/json-events 10.3
4 TestDownloadOnly/v1.28.0/preload-exists 0
8 TestDownloadOnly/v1.28.0/LogsDuration 0.35
9 TestDownloadOnly/v1.28.0/DeleteAll 0.33
10 TestDownloadOnly/v1.28.0/DeleteAlwaysSucceeds 0.22
12 TestDownloadOnly/v1.34.2/json-events 7.01
13 TestDownloadOnly/v1.34.2/preload-exists 0
17 TestDownloadOnly/v1.34.2/LogsDuration 0.09
18 TestDownloadOnly/v1.34.2/DeleteAll 0.21
19 TestDownloadOnly/v1.34.2/DeleteAlwaysSucceeds 0.14
21 TestDownloadOnly/v1.35.0-beta.0/json-events 2.44
23 TestDownloadOnly/v1.35.0-beta.0/cached-images 0
24 TestDownloadOnly/v1.35.0-beta.0/binaries 0
26 TestDownloadOnly/v1.35.0-beta.0/LogsDuration 0.08
27 TestDownloadOnly/v1.35.0-beta.0/DeleteAll 0.23
28 TestDownloadOnly/v1.35.0-beta.0/DeleteAlwaysSucceeds 0.13
30 TestBinaryMirror 0.61
34 TestAddons/PreSetup/EnablingAddonOnNonExistingCluster 0.08
35 TestAddons/PreSetup/DisablingAddonOnNonExistingCluster 0.07
36 TestAddons/Setup 154.91
38 TestAddons/serial/Volcano 40.75
40 TestAddons/serial/GCPAuth/Namespaces 0.19
41 TestAddons/serial/GCPAuth/FakeCredentials 8.86
44 TestAddons/parallel/Registry 17.45
45 TestAddons/parallel/RegistryCreds 0.74
46 TestAddons/parallel/Ingress 19.94
47 TestAddons/parallel/InspektorGadget 10.92
48 TestAddons/parallel/MetricsServer 6.85
50 TestAddons/parallel/CSI 31.29
51 TestAddons/parallel/Headlamp 17.19
52 TestAddons/parallel/CloudSpanner 5.59
53 TestAddons/parallel/LocalPath 51.08
54 TestAddons/parallel/NvidiaDevicePlugin 5.53
55 TestAddons/parallel/Yakd 11.83
57 TestAddons/StoppedEnableDisable 12.35
58 TestCertOptions 37.44
59 TestCertExpiration 233.32
61 TestForceSystemdFlag 42.58
62 TestForceSystemdEnv 44.96
63 TestDockerEnvContainerd 48.4
67 TestErrorSpam/setup 33.25
68 TestErrorSpam/start 0.8
69 TestErrorSpam/status 1.2
70 TestErrorSpam/pause 1.84
71 TestErrorSpam/unpause 1.98
72 TestErrorSpam/stop 1.58
75 TestFunctional/serial/CopySyncFile 0
76 TestFunctional/serial/StartWithProxy 76.63
77 TestFunctional/serial/AuditLog 0
78 TestFunctional/serial/SoftStart 7.46
79 TestFunctional/serial/KubeContext 0.07
80 TestFunctional/serial/KubectlGetPods 0.13
83 TestFunctional/serial/CacheCmd/cache/add_remote 3.65
84 TestFunctional/serial/CacheCmd/cache/add_local 1.33
85 TestFunctional/serial/CacheCmd/cache/CacheDelete 0.07
86 TestFunctional/serial/CacheCmd/cache/list 0.06
87 TestFunctional/serial/CacheCmd/cache/verify_cache_inside_node 0.31
88 TestFunctional/serial/CacheCmd/cache/cache_reload 1.89
89 TestFunctional/serial/CacheCmd/cache/delete 0.13
90 TestFunctional/serial/MinikubeKubectlCmd 0.14
91 TestFunctional/serial/MinikubeKubectlCmdDirectly 0.15
92 TestFunctional/serial/ExtraConfig 53.93
93 TestFunctional/serial/ComponentHealth 0.1
94 TestFunctional/serial/LogsCmd 1.49
95 TestFunctional/serial/LogsFileCmd 1.48
96 TestFunctional/serial/InvalidService 4.93
98 TestFunctional/parallel/ConfigCmd 0.45
99 TestFunctional/parallel/DashboardCmd 6.26
100 TestFunctional/parallel/DryRun 0.56
101 TestFunctional/parallel/InternationalLanguage 0.23
102 TestFunctional/parallel/StatusCmd 1.34
106 TestFunctional/parallel/ServiceCmdConnect 8.76
107 TestFunctional/parallel/AddonsCmd 0.17
108 TestFunctional/parallel/PersistentVolumeClaim 23.88
110 TestFunctional/parallel/SSHCmd 0.73
111 TestFunctional/parallel/CpCmd 2.56
113 TestFunctional/parallel/FileSync 0.34
114 TestFunctional/parallel/CertSync 2.27
118 TestFunctional/parallel/NodeLabels 0.12
120 TestFunctional/parallel/NonActiveRuntimeDisabled 0.75
122 TestFunctional/parallel/License 0.42
124 TestFunctional/parallel/TunnelCmd/serial/RunSecondTunnel 0.8
125 TestFunctional/parallel/TunnelCmd/serial/StartTunnel 0
127 TestFunctional/parallel/TunnelCmd/serial/WaitService/Setup 8.47
128 TestFunctional/parallel/TunnelCmd/serial/WaitService/IngressIP 0.12
129 TestFunctional/parallel/TunnelCmd/serial/AccessDirect 0
133 TestFunctional/parallel/TunnelCmd/serial/DeleteTunnel 0.11
134 TestFunctional/parallel/ServiceCmd/DeployApp 7.25
135 TestFunctional/parallel/ProfileCmd/profile_not_create 0.47
136 TestFunctional/parallel/ServiceCmd/List 0.63
137 TestFunctional/parallel/ProfileCmd/profile_list 0.55
138 TestFunctional/parallel/ProfileCmd/profile_json_output 0.54
139 TestFunctional/parallel/ServiceCmd/JSONOutput 0.63
140 TestFunctional/parallel/MountCmd/any-port 8.8
141 TestFunctional/parallel/ServiceCmd/HTTPS 0.59
142 TestFunctional/parallel/ServiceCmd/Format 0.41
143 TestFunctional/parallel/ServiceCmd/URL 0.51
144 TestFunctional/parallel/MountCmd/specific-port 2.33
145 TestFunctional/parallel/Version/short 0.08
146 TestFunctional/parallel/Version/components 1.35
147 TestFunctional/parallel/MountCmd/VerifyCleanup 2.33
148 TestFunctional/parallel/ImageCommands/ImageListShort 0.34
149 TestFunctional/parallel/ImageCommands/ImageListTable 0.3
150 TestFunctional/parallel/ImageCommands/ImageListJson 0.32
151 TestFunctional/parallel/ImageCommands/ImageListYaml 0.33
152 TestFunctional/parallel/ImageCommands/ImageBuild 4
153 TestFunctional/parallel/ImageCommands/Setup 0.69
154 TestFunctional/parallel/ImageCommands/ImageLoadDaemon 1.3
155 TestFunctional/parallel/ImageCommands/ImageReloadDaemon 1.34
156 TestFunctional/parallel/ImageCommands/ImageTagAndLoadDaemon 1.61
157 TestFunctional/parallel/UpdateContextCmd/no_changes 0.2
158 TestFunctional/parallel/UpdateContextCmd/no_minikube_cluster 0.28
159 TestFunctional/parallel/UpdateContextCmd/no_clusters 0.19
160 TestFunctional/parallel/ImageCommands/ImageSaveToFile 0.43
161 TestFunctional/parallel/ImageCommands/ImageRemove 0.51
162 TestFunctional/parallel/ImageCommands/ImageLoadFromFile 0.7
163 TestFunctional/parallel/ImageCommands/ImageSaveDaemon 0.48
164 TestFunctional/delete_echo-server_images 0.04
165 TestFunctional/delete_my-image_image 0.02
166 TestFunctional/delete_minikube_cached_images 0.02
170 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/CopySyncFile 0
172 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/AuditLog 0
174 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/KubeContext 0.07
178 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/CacheCmd/cache/add_remote 3.39
179 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/CacheCmd/cache/add_local 1.08
180 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/CacheCmd/cache/CacheDelete 0.06
181 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/CacheCmd/cache/list 0.05
182 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/CacheCmd/cache/verify_cache_inside_node 0.29
183 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/CacheCmd/cache/cache_reload 1.83
184 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/CacheCmd/cache/delete 0.12
189 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/LogsCmd 1.25
190 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/LogsFileCmd 1.03
193 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ConfigCmd 0.41
195 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/DryRun 0.47
196 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/InternationalLanguage 0.2
202 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/AddonsCmd 0.14
205 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/SSHCmd 0.71
206 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/CpCmd 2.22
208 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/FileSync 0.28
209 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/CertSync 1.72
215 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/NonActiveRuntimeDisabled 0.53
217 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/License 0.19
220 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/TunnelCmd/serial/StartTunnel 0
227 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/TunnelCmd/serial/DeleteTunnel 0.1
234 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ProfileCmd/profile_not_create 0.39
235 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ProfileCmd/profile_list 0.4
236 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ProfileCmd/profile_json_output 0.38
238 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/MountCmd/specific-port 2.1
239 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/MountCmd/VerifyCleanup 1.85
240 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/Version/short 0.08
241 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/Version/components 0.48
242 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageListShort 0.23
243 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageListTable 0.22
244 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageListJson 0.22
245 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageListYaml 0.25
246 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageBuild 3.68
247 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/Setup 0.25
248 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageLoadDaemon 1.12
249 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageReloadDaemon 1.09
250 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageTagAndLoadDaemon 1.6
251 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageSaveToFile 0.34
252 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageRemove 0.48
253 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageLoadFromFile 0.68
254 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageSaveDaemon 0.38
255 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/UpdateContextCmd/no_changes 0.14
256 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/UpdateContextCmd/no_minikube_cluster 0.15
257 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/UpdateContextCmd/no_clusters 0.17
258 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/delete_echo-server_images 0.05
259 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/delete_my-image_image 0.02
260 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/delete_minikube_cached_images 0.01
264 TestMultiControlPlane/serial/StartCluster 197.71
265 TestMultiControlPlane/serial/DeployApp 7.6
266 TestMultiControlPlane/serial/PingHostFromPods 1.68
267 TestMultiControlPlane/serial/AddWorkerNode 60.11
268 TestMultiControlPlane/serial/NodeLabels 0.11
269 TestMultiControlPlane/serial/HAppyAfterClusterStart 1.1
270 TestMultiControlPlane/serial/CopyFile 20.72
271 TestMultiControlPlane/serial/StopSecondaryNode 12.97
272 TestMultiControlPlane/serial/DegradedAfterControlPlaneNodeStop 0.86
273 TestMultiControlPlane/serial/RestartSecondaryNode 14.67
274 TestMultiControlPlane/serial/HAppyAfterSecondaryNodeRestart 1.25
275 TestMultiControlPlane/serial/RestartClusterKeepsNodes 98.97
276 TestMultiControlPlane/serial/DeleteSecondaryNode 11.28
277 TestMultiControlPlane/serial/DegradedAfterSecondaryNodeDelete 0.78
278 TestMultiControlPlane/serial/StopCluster 36.31
279 TestMultiControlPlane/serial/RestartCluster 60.5
280 TestMultiControlPlane/serial/DegradedAfterClusterRestart 0.83
281 TestMultiControlPlane/serial/AddSecondaryNode 90.35
282 TestMultiControlPlane/serial/HAppyAfterSecondaryNodeAdd 1.12
287 TestJSONOutput/start/Command 81.52
288 TestJSONOutput/start/Audit 0
290 TestJSONOutput/start/parallel/DistinctCurrentSteps 0
291 TestJSONOutput/start/parallel/IncreasingCurrentSteps 0
293 TestJSONOutput/pause/Command 0.74
294 TestJSONOutput/pause/Audit 0
296 TestJSONOutput/pause/parallel/DistinctCurrentSteps 0
297 TestJSONOutput/pause/parallel/IncreasingCurrentSteps 0
299 TestJSONOutput/unpause/Command 0.64
300 TestJSONOutput/unpause/Audit 0
302 TestJSONOutput/unpause/parallel/DistinctCurrentSteps 0
303 TestJSONOutput/unpause/parallel/IncreasingCurrentSteps 0
305 TestJSONOutput/stop/Command 6.01
306 TestJSONOutput/stop/Audit 0
308 TestJSONOutput/stop/parallel/DistinctCurrentSteps 0
309 TestJSONOutput/stop/parallel/IncreasingCurrentSteps 0
310 TestErrorJSONOutput 0.25
312 TestKicCustomNetwork/create_custom_network 39.88
313 TestKicCustomNetwork/use_default_bridge_network 35.19
314 TestKicExistingNetwork 35.66
315 TestKicCustomSubnet 34.13
316 TestKicStaticIP 35.99
317 TestMainNoArgs 0.05
318 TestMinikubeProfile 71.88
321 TestMountStart/serial/StartWithMountFirst 8.16
322 TestMountStart/serial/VerifyMountFirst 0.28
323 TestMountStart/serial/StartWithMountSecond 8.54
324 TestMountStart/serial/VerifyMountSecond 0.27
325 TestMountStart/serial/DeleteFirst 1.72
326 TestMountStart/serial/VerifyMountPostDelete 0.28
327 TestMountStart/serial/Stop 1.28
328 TestMountStart/serial/RestartStopped 7.94
329 TestMountStart/serial/VerifyMountPostStop 0.29
332 TestMultiNode/serial/FreshStart2Nodes 106.09
333 TestMultiNode/serial/DeployApp2Nodes 5.01
334 TestMultiNode/serial/PingHostFrom2Pods 1.07
335 TestMultiNode/serial/AddNode 57.52
336 TestMultiNode/serial/MultiNodeLabels 0.1
337 TestMultiNode/serial/ProfileList 0.75
338 TestMultiNode/serial/CopyFile 10.71
339 TestMultiNode/serial/StopNode 2.47
340 TestMultiNode/serial/StartAfterStop 8.07
341 TestMultiNode/serial/RestartKeepsNodes 75.57
342 TestMultiNode/serial/DeleteNode 5.76
343 TestMultiNode/serial/StopMultiNode 24.11
344 TestMultiNode/serial/RestartMultiNode 51.74
345 TestMultiNode/serial/ValidateNameConflict 39.7
350 TestPreload 125.34
352 TestScheduledStopUnix 108.11
355 TestInsufficientStorage 12.55
356 TestRunningBinaryUpgrade 323.11
359 TestMissingContainerUpgrade 175.24
361 TestNoKubernetes/serial/StartNoK8sWithVersion 0.09
362 TestNoKubernetes/serial/StartWithK8s 42.48
363 TestNoKubernetes/serial/StartWithStopK8s 25.61
364 TestNoKubernetes/serial/Start 9.4
365 TestNoKubernetes/serial/VerifyNok8sNoK8sDownloads 0
366 TestNoKubernetes/serial/VerifyK8sNotRunning 0.43
367 TestNoKubernetes/serial/ProfileList 3.18
368 TestNoKubernetes/serial/Stop 1.3
369 TestNoKubernetes/serial/StartNoArgs 6.91
370 TestNoKubernetes/serial/VerifyK8sNotRunningSecond 0.28
371 TestStoppedBinaryUpgrade/Setup 11.21
372 TestStoppedBinaryUpgrade/Upgrade 305.96
373 TestStoppedBinaryUpgrade/MinikubeLogs 2.16
382 TestPause/serial/Start 82.91
383 TestPause/serial/SecondStartNoReconfiguration 7.4
384 TestPause/serial/Pause 1.09
385 TestPause/serial/VerifyStatus 0.54
386 TestPause/serial/Unpause 1.01
387 TestPause/serial/PauseAgain 1.27
388 TestPause/serial/DeletePaused 3.81
389 TestPause/serial/VerifyDeletedResources 1.38
TestDownloadOnly/v1.28.0/json-events (10.3s)

=== RUN   TestDownloadOnly/v1.28.0/json-events
aaa_download_only_test.go:80: (dbg) Run:  out/minikube-linux-arm64 start -o=json --download-only -p download-only-132306 --force --alsologtostderr --kubernetes-version=v1.28.0 --container-runtime=containerd --driver=docker  --container-runtime=containerd
aaa_download_only_test.go:80: (dbg) Done: out/minikube-linux-arm64 start -o=json --download-only -p download-only-132306 --force --alsologtostderr --kubernetes-version=v1.28.0 --container-runtime=containerd --driver=docker  --container-runtime=containerd: (10.297275932s)
--- PASS: TestDownloadOnly/v1.28.0/json-events (10.30s)

TestDownloadOnly/v1.28.0/preload-exists (0s)

=== RUN   TestDownloadOnly/v1.28.0/preload-exists
I1202 18:47:58.166861    4435 preload.go:188] Checking if preload exists for k8s version v1.28.0 and runtime containerd
I1202 18:47:58.166943    4435 preload.go:203] Found local preload: /home/jenkins/minikube-integration/22021-2487/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.28.0-containerd-overlay2-arm64.tar.lz4
--- PASS: TestDownloadOnly/v1.28.0/preload-exists (0.00s)

TestDownloadOnly/v1.28.0/LogsDuration (0.35s)

=== RUN   TestDownloadOnly/v1.28.0/LogsDuration
aaa_download_only_test.go:183: (dbg) Run:  out/minikube-linux-arm64 logs -p download-only-132306
aaa_download_only_test.go:183: (dbg) Non-zero exit: out/minikube-linux-arm64 logs -p download-only-132306: exit status 85 (345.469105ms)

-- stdout --
	
	==> Audit <==
	┌─────────┬───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┬──────────────────────┬─────────┬─────────┬─────────────────────┬──────────┐
	│ COMMAND │                                                                                         ARGS                                                                                          │       PROFILE        │  USER   │ VERSION │     START TIME      │ END TIME │
	├─────────┼───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┼──────────────────────┼─────────┼─────────┼─────────────────────┼──────────┤
	│ start   │ -o=json --download-only -p download-only-132306 --force --alsologtostderr --kubernetes-version=v1.28.0 --container-runtime=containerd --driver=docker  --container-runtime=containerd │ download-only-132306 │ jenkins │ v1.37.0 │ 02 Dec 25 18:47 UTC │          │
	└─────────┴───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┴──────────────────────┴─────────┴─────────┴─────────────────────┴──────────┘
	
	
	==> Last Start <==
	Log file created at: 2025/12/02 18:47:47
	Running on machine: ip-172-31-31-251
	Binary: Built with gc go1.25.3 for linux/arm64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I1202 18:47:47.925526    4440 out.go:360] Setting OutFile to fd 1 ...
	I1202 18:47:47.925752    4440 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1202 18:47:47.925779    4440 out.go:374] Setting ErrFile to fd 2...
	I1202 18:47:47.925798    4440 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1202 18:47:47.926087    4440 root.go:338] Updating PATH: /home/jenkins/minikube-integration/22021-2487/.minikube/bin
	W1202 18:47:47.926258    4440 root.go:314] Error reading config file at /home/jenkins/minikube-integration/22021-2487/.minikube/config/config.json: open /home/jenkins/minikube-integration/22021-2487/.minikube/config/config.json: no such file or directory
	I1202 18:47:47.926716    4440 out.go:368] Setting JSON to true
	I1202 18:47:47.927544    4440 start.go:133] hostinfo: {"hostname":"ip-172-31-31-251","uptime":1804,"bootTime":1764699464,"procs":150,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1084-aws","kernelArch":"aarch64","virtualizationSystem":"","virtualizationRole":"","hostId":"982e3628-3742-4b3e-bb63-ac1b07660ec7"}
	I1202 18:47:47.927639    4440 start.go:143] virtualization:  
	I1202 18:47:47.933323    4440 out.go:99] [download-only-132306] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	W1202 18:47:47.933544    4440 preload.go:354] Failed to list preload files: open /home/jenkins/minikube-integration/22021-2487/.minikube/cache/preloaded-tarball: no such file or directory
	I1202 18:47:47.933660    4440 notify.go:221] Checking for updates...
	I1202 18:47:47.937420    4440 out.go:171] MINIKUBE_LOCATION=22021
	I1202 18:47:47.941100    4440 out.go:171] MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I1202 18:47:47.944301    4440 out.go:171] KUBECONFIG=/home/jenkins/minikube-integration/22021-2487/kubeconfig
	I1202 18:47:47.947847    4440 out.go:171] MINIKUBE_HOME=/home/jenkins/minikube-integration/22021-2487/.minikube
	I1202 18:47:47.950937    4440 out.go:171] MINIKUBE_BIN=out/minikube-linux-arm64
	W1202 18:47:47.956919    4440 out.go:336] minikube skips various validations when --force is supplied; this may lead to unexpected behavior
	I1202 18:47:47.957275    4440 driver.go:422] Setting default libvirt URI to qemu:///system
	I1202 18:47:47.980721    4440 docker.go:124] docker version: linux-28.1.1:Docker Engine - Community
	I1202 18:47:47.980839    4440 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1202 18:47:48.412446    4440 info.go:266] docker info: {ID:EOU5:DNGX:XN6V:L2FZ:UXRM:5TWK:EVUR:KC2F:GT7Z:Y4O4:GB77:5PD3 Containers:0 ContainersRunning:0 ContainersPaused:0 ContainersStopped:0 Images:1 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:29 OomKillDisable:true NGoroutines:61 SystemTime:2025-12-02 18:47:48.402960901 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:a
arch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214839296 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-31-251 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx P
ath:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1202 18:47:48.412555    4440 docker.go:319] overlay module found
	I1202 18:47:48.415624    4440 out.go:99] Using the docker driver based on user configuration
	I1202 18:47:48.415659    4440 start.go:309] selected driver: docker
	I1202 18:47:48.415665    4440 start.go:927] validating driver "docker" against <nil>
	I1202 18:47:48.415769    4440 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1202 18:47:48.472701    4440 info.go:266] docker info: {ID:EOU5:DNGX:XN6V:L2FZ:UXRM:5TWK:EVUR:KC2F:GT7Z:Y4O4:GB77:5PD3 Containers:0 ContainersRunning:0 ContainersPaused:0 ContainersStopped:0 Images:1 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:29 OomKillDisable:true NGoroutines:61 SystemTime:2025-12-02 18:47:48.463871863 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:a
arch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214839296 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-31-251 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx P
ath:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1202 18:47:48.472857    4440 start_flags.go:327] no existing cluster config was found, will generate one from the flags 
	I1202 18:47:48.473142    4440 start_flags.go:410] Using suggested 3072MB memory alloc based on sys=7834MB, container=7834MB
	I1202 18:47:48.473313    4440 start_flags.go:974] Wait components to verify : map[apiserver:true system_pods:true]
	I1202 18:47:48.476394    4440 out.go:171] Using Docker driver with root privileges
	I1202 18:47:48.479484    4440 cni.go:84] Creating CNI manager for ""
	I1202 18:47:48.479569    4440 cni.go:143] "docker" driver + "containerd" runtime found, recommending kindnet
	I1202 18:47:48.479585    4440 start_flags.go:336] Found "CNI" CNI - setting NetworkPlugin=cni
	I1202 18:47:48.479662    4440 start.go:353] cluster config:
	{Name:download-only-132306 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b Memory:3072 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.28.0 ClusterName:download-only-132306 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local Co
ntainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP: Port:8443 KubernetesVersion:v1.28.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1202 18:47:48.482654    4440 out.go:99] Starting "download-only-132306" primary control-plane node in "download-only-132306" cluster
	I1202 18:47:48.482673    4440 cache.go:134] Beginning downloading kic base image for docker with containerd
	I1202 18:47:48.485633    4440 out.go:99] Pulling base image v0.0.48-1764169655-21974 ...
	I1202 18:47:48.485682    4440 preload.go:188] Checking if preload exists for k8s version v1.28.0 and runtime containerd
	I1202 18:47:48.485829    4440 image.go:81] Checking for gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b in local docker daemon
	I1202 18:47:48.502424    4440 cache.go:163] Downloading gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b to local cache
	I1202 18:47:48.502602    4440 image.go:65] Checking for gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b in local cache directory
	I1202 18:47:48.502711    4440 image.go:150] Writing gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b to local cache
	I1202 18:47:48.597216    4440 preload.go:148] Found remote preload: https://storage.googleapis.com/minikube-preloaded-volume-tarballs/v18/v1.28.0/preloaded-images-k8s-v18-v1.28.0-containerd-overlay2-arm64.tar.lz4
	I1202 18:47:48.597246    4440 cache.go:65] Caching tarball of preloaded images
	I1202 18:47:48.597407    4440 preload.go:188] Checking if preload exists for k8s version v1.28.0 and runtime containerd
	I1202 18:47:48.600821    4440 out.go:99] Downloading Kubernetes v1.28.0 preload ...
	I1202 18:47:48.600851    4440 preload.go:318] getting checksum for preloaded-images-k8s-v18-v1.28.0-containerd-overlay2-arm64.tar.lz4 from gcs api...
	I1202 18:47:48.688889    4440 preload.go:295] Got checksum from GCS API "38d7f581f2fa4226c8af2c9106b982b7"
	I1202 18:47:48.689020    4440 download.go:108] Downloading: https://storage.googleapis.com/minikube-preloaded-volume-tarballs/v18/v1.28.0/preloaded-images-k8s-v18-v1.28.0-containerd-overlay2-arm64.tar.lz4?checksum=md5:38d7f581f2fa4226c8af2c9106b982b7 -> /home/jenkins/minikube-integration/22021-2487/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.28.0-containerd-overlay2-arm64.tar.lz4
	I1202 18:47:56.025761    4440 cache.go:68] Finished verifying existence of preloaded tar for v1.28.0 on containerd
	I1202 18:47:56.026328    4440 profile.go:143] Saving config to /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/download-only-132306/config.json ...
	I1202 18:47:56.026367    4440 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/download-only-132306/config.json: {Name:mk70fd92167ed936abf74b665057a0625248ea89 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I1202 18:47:56.026548    4440 preload.go:188] Checking if preload exists for k8s version v1.28.0 and runtime containerd
	I1202 18:47:56.026832    4440 download.go:108] Downloading: https://dl.k8s.io/release/v1.28.0/bin/linux/arm64/kubectl?checksum=file:https://dl.k8s.io/release/v1.28.0/bin/linux/arm64/kubectl.sha256 -> /home/jenkins/minikube-integration/22021-2487/.minikube/cache/linux/arm64/v1.28.0/kubectl
	
	
	* The control-plane node download-only-132306 host does not exist
	  To start a cluster, run: "minikube start -p download-only-132306"

-- /stdout --
aaa_download_only_test.go:184: minikube logs failed with error: exit status 85
--- PASS: TestDownloadOnly/v1.28.0/LogsDuration (0.35s)

TestDownloadOnly/v1.28.0/DeleteAll (0.33s)

=== RUN   TestDownloadOnly/v1.28.0/DeleteAll
aaa_download_only_test.go:196: (dbg) Run:  out/minikube-linux-arm64 delete --all
--- PASS: TestDownloadOnly/v1.28.0/DeleteAll (0.33s)

TestDownloadOnly/v1.28.0/DeleteAlwaysSucceeds (0.22s)

=== RUN   TestDownloadOnly/v1.28.0/DeleteAlwaysSucceeds
aaa_download_only_test.go:207: (dbg) Run:  out/minikube-linux-arm64 delete -p download-only-132306
--- PASS: TestDownloadOnly/v1.28.0/DeleteAlwaysSucceeds (0.22s)

TestDownloadOnly/v1.34.2/json-events (7.01s)

=== RUN   TestDownloadOnly/v1.34.2/json-events
aaa_download_only_test.go:80: (dbg) Run:  out/minikube-linux-arm64 start -o=json --download-only -p download-only-423907 --force --alsologtostderr --kubernetes-version=v1.34.2 --container-runtime=containerd --driver=docker  --container-runtime=containerd
aaa_download_only_test.go:80: (dbg) Done: out/minikube-linux-arm64 start -o=json --download-only -p download-only-423907 --force --alsologtostderr --kubernetes-version=v1.34.2 --container-runtime=containerd --driver=docker  --container-runtime=containerd: (7.011150534s)
--- PASS: TestDownloadOnly/v1.34.2/json-events (7.01s)

TestDownloadOnly/v1.34.2/preload-exists (0s)

=== RUN   TestDownloadOnly/v1.34.2/preload-exists
I1202 18:48:06.076943    4435 preload.go:188] Checking if preload exists for k8s version v1.34.2 and runtime containerd
I1202 18:48:06.076977    4435 preload.go:203] Found local preload: /home/jenkins/minikube-integration/22021-2487/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.34.2-containerd-overlay2-arm64.tar.lz4
--- PASS: TestDownloadOnly/v1.34.2/preload-exists (0.00s)

TestDownloadOnly/v1.34.2/LogsDuration (0.09s)

=== RUN   TestDownloadOnly/v1.34.2/LogsDuration
aaa_download_only_test.go:183: (dbg) Run:  out/minikube-linux-arm64 logs -p download-only-423907
aaa_download_only_test.go:183: (dbg) Non-zero exit: out/minikube-linux-arm64 logs -p download-only-423907: exit status 85 (88.282572ms)

-- stdout --
	
	==> Audit <==
	┌─────────┬───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┬──────────────────────┬─────────┬─────────┬─────────────────────┬─────────────────────┐
	│ COMMAND │                                                                                         ARGS                                                                                          │       PROFILE        │  USER   │ VERSION │     START TIME      │      END TIME       │
	├─────────┼───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┼──────────────────────┼─────────┼─────────┼─────────────────────┼─────────────────────┤
	│ start   │ -o=json --download-only -p download-only-132306 --force --alsologtostderr --kubernetes-version=v1.28.0 --container-runtime=containerd --driver=docker  --container-runtime=containerd │ download-only-132306 │ jenkins │ v1.37.0 │ 02 Dec 25 18:47 UTC │                     │
	│ delete  │ --all                                                                                                                                                                                 │ minikube             │ jenkins │ v1.37.0 │ 02 Dec 25 18:47 UTC │ 02 Dec 25 18:47 UTC │
	│ delete  │ -p download-only-132306                                                                                                                                                               │ download-only-132306 │ jenkins │ v1.37.0 │ 02 Dec 25 18:47 UTC │ 02 Dec 25 18:47 UTC │
	│ start   │ -o=json --download-only -p download-only-423907 --force --alsologtostderr --kubernetes-version=v1.34.2 --container-runtime=containerd --driver=docker  --container-runtime=containerd │ download-only-423907 │ jenkins │ v1.37.0 │ 02 Dec 25 18:47 UTC │                     │
	└─────────┴───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┴──────────────────────┴─────────┴─────────┴─────────────────────┴─────────────────────┘
	
	
	==> Last Start <==
	Log file created at: 2025/12/02 18:47:59
	Running on machine: ip-172-31-31-251
	Binary: Built with gc go1.25.3 for linux/arm64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I1202 18:47:59.105488    4643 out.go:360] Setting OutFile to fd 1 ...
	I1202 18:47:59.105994    4643 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1202 18:47:59.106043    4643 out.go:374] Setting ErrFile to fd 2...
	I1202 18:47:59.106062    4643 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1202 18:47:59.106437    4643 root.go:338] Updating PATH: /home/jenkins/minikube-integration/22021-2487/.minikube/bin
	I1202 18:47:59.106954    4643 out.go:368] Setting JSON to true
	I1202 18:47:59.107746    4643 start.go:133] hostinfo: {"hostname":"ip-172-31-31-251","uptime":1815,"bootTime":1764699464,"procs":143,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1084-aws","kernelArch":"aarch64","virtualizationSystem":"","virtualizationRole":"","hostId":"982e3628-3742-4b3e-bb63-ac1b07660ec7"}
	I1202 18:47:59.107860    4643 start.go:143] virtualization:  
	I1202 18:47:59.118636    4643 out.go:99] [download-only-423907] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	I1202 18:47:59.118938    4643 notify.go:221] Checking for updates...
	I1202 18:47:59.128175    4643 out.go:171] MINIKUBE_LOCATION=22021
	I1202 18:47:59.137816    4643 out.go:171] MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I1202 18:47:59.146949    4643 out.go:171] KUBECONFIG=/home/jenkins/minikube-integration/22021-2487/kubeconfig
	I1202 18:47:59.168541    4643 out.go:171] MINIKUBE_HOME=/home/jenkins/minikube-integration/22021-2487/.minikube
	I1202 18:47:59.191971    4643 out.go:171] MINIKUBE_BIN=out/minikube-linux-arm64
	W1202 18:47:59.240564    4643 out.go:336] minikube skips various validations when --force is supplied; this may lead to unexpected behavior
	I1202 18:47:59.240877    4643 driver.go:422] Setting default libvirt URI to qemu:///system
	I1202 18:47:59.261802    4643 docker.go:124] docker version: linux-28.1.1:Docker Engine - Community
	I1202 18:47:59.261908    4643 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1202 18:47:59.333262    4643 info.go:266] docker info: {ID:EOU5:DNGX:XN6V:L2FZ:UXRM:5TWK:EVUR:KC2F:GT7Z:Y4O4:GB77:5PD3 Containers:0 ContainersRunning:0 ContainersPaused:0 ContainersStopped:0 Images:1 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:29 OomKillDisable:true NGoroutines:47 SystemTime:2025-12-02 18:47:59.323274531 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214839296 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-31-251 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1202 18:47:59.333370    4643 docker.go:319] overlay module found
	I1202 18:47:59.344074    4643 out.go:99] Using the docker driver based on user configuration
	I1202 18:47:59.344151    4643 start.go:309] selected driver: docker
	I1202 18:47:59.344169    4643 start.go:927] validating driver "docker" against <nil>
	I1202 18:47:59.344302    4643 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1202 18:47:59.413618    4643 info.go:266] docker info: {ID:EOU5:DNGX:XN6V:L2FZ:UXRM:5TWK:EVUR:KC2F:GT7Z:Y4O4:GB77:5PD3 Containers:0 ContainersRunning:0 ContainersPaused:0 ContainersStopped:0 Images:1 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:29 OomKillDisable:true NGoroutines:47 SystemTime:2025-12-02 18:47:59.403251769 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214839296 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-31-251 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1202 18:47:59.413771    4643 start_flags.go:327] no existing cluster config was found, will generate one from the flags 
	I1202 18:47:59.414070    4643 start_flags.go:410] Using suggested 3072MB memory alloc based on sys=7834MB, container=7834MB
	I1202 18:47:59.414211    4643 start_flags.go:974] Wait components to verify : map[apiserver:true system_pods:true]
	I1202 18:47:59.426544    4643 out.go:171] Using Docker driver with root privileges
	I1202 18:47:59.435760    4643 cni.go:84] Creating CNI manager for ""
	I1202 18:47:59.435836    4643 cni.go:143] "docker" driver + "containerd" runtime found, recommending kindnet
	I1202 18:47:59.435848    4643 start_flags.go:336] Found "CNI" CNI - setting NetworkPlugin=cni
	I1202 18:47:59.435939    4643 start.go:353] cluster config:
	{Name:download-only-423907 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b Memory:3072 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.34.2 ClusterName:download-only-423907 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP: Port:8443 KubernetesVersion:v1.34.2 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1202 18:47:59.444262    4643 out.go:99] Starting "download-only-423907" primary control-plane node in "download-only-423907" cluster
	I1202 18:47:59.444296    4643 cache.go:134] Beginning downloading kic base image for docker with containerd
	I1202 18:47:59.452922    4643 out.go:99] Pulling base image v0.0.48-1764169655-21974 ...
	I1202 18:47:59.452987    4643 preload.go:188] Checking if preload exists for k8s version v1.34.2 and runtime containerd
	I1202 18:47:59.453085    4643 image.go:81] Checking for gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b in local docker daemon
	I1202 18:47:59.469194    4643 cache.go:163] Downloading gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b to local cache
	I1202 18:47:59.469345    4643 image.go:65] Checking for gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b in local cache directory
	I1202 18:47:59.469367    4643 image.go:68] Found gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b in local cache directory, skipping pull
	I1202 18:47:59.469372    4643 image.go:137] gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b exists in cache, skipping pull
	I1202 18:47:59.469383    4643 cache.go:166] successfully saved gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b as a tarball
	I1202 18:47:59.511434    4643 preload.go:148] Found remote preload: https://storage.googleapis.com/minikube-preloaded-volume-tarballs/v18/v1.34.2/preloaded-images-k8s-v18-v1.34.2-containerd-overlay2-arm64.tar.lz4
	I1202 18:47:59.511481    4643 cache.go:65] Caching tarball of preloaded images
	I1202 18:47:59.511654    4643 preload.go:188] Checking if preload exists for k8s version v1.34.2 and runtime containerd
	I1202 18:47:59.518498    4643 out.go:99] Downloading Kubernetes v1.34.2 preload ...
	I1202 18:47:59.518533    4643 preload.go:318] getting checksum for preloaded-images-k8s-v18-v1.34.2-containerd-overlay2-arm64.tar.lz4 from gcs api...
	I1202 18:47:59.603416    4643 preload.go:295] Got checksum from GCS API "cd1a05d5493c9270e248bf47fb3f071d"
	I1202 18:47:59.603471    4643 download.go:108] Downloading: https://storage.googleapis.com/minikube-preloaded-volume-tarballs/v18/v1.34.2/preloaded-images-k8s-v18-v1.34.2-containerd-overlay2-arm64.tar.lz4?checksum=md5:cd1a05d5493c9270e248bf47fb3f071d -> /home/jenkins/minikube-integration/22021-2487/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.34.2-containerd-overlay2-arm64.tar.lz4
	
	
	* The control-plane node download-only-423907 host does not exist
	  To start a cluster, run: "minikube start -p download-only-423907"

-- /stdout --
aaa_download_only_test.go:184: minikube logs failed with error: exit status 85
--- PASS: TestDownloadOnly/v1.34.2/LogsDuration (0.09s)

TestDownloadOnly/v1.34.2/DeleteAll (0.21s)

=== RUN   TestDownloadOnly/v1.34.2/DeleteAll
aaa_download_only_test.go:196: (dbg) Run:  out/minikube-linux-arm64 delete --all
--- PASS: TestDownloadOnly/v1.34.2/DeleteAll (0.21s)

TestDownloadOnly/v1.34.2/DeleteAlwaysSucceeds (0.14s)

=== RUN   TestDownloadOnly/v1.34.2/DeleteAlwaysSucceeds
aaa_download_only_test.go:207: (dbg) Run:  out/minikube-linux-arm64 delete -p download-only-423907
--- PASS: TestDownloadOnly/v1.34.2/DeleteAlwaysSucceeds (0.14s)

TestDownloadOnly/v1.35.0-beta.0/json-events (2.44s)

=== RUN   TestDownloadOnly/v1.35.0-beta.0/json-events
aaa_download_only_test.go:80: (dbg) Run:  out/minikube-linux-arm64 start -o=json --download-only -p download-only-178568 --force --alsologtostderr --kubernetes-version=v1.35.0-beta.0 --container-runtime=containerd --driver=docker  --container-runtime=containerd
aaa_download_only_test.go:80: (dbg) Done: out/minikube-linux-arm64 start -o=json --download-only -p download-only-178568 --force --alsologtostderr --kubernetes-version=v1.35.0-beta.0 --container-runtime=containerd --driver=docker  --container-runtime=containerd: (2.437472435s)
--- PASS: TestDownloadOnly/v1.35.0-beta.0/json-events (2.44s)

TestDownloadOnly/v1.35.0-beta.0/cached-images (0s)

=== RUN   TestDownloadOnly/v1.35.0-beta.0/cached-images
--- PASS: TestDownloadOnly/v1.35.0-beta.0/cached-images (0.00s)

TestDownloadOnly/v1.35.0-beta.0/binaries (0s)

=== RUN   TestDownloadOnly/v1.35.0-beta.0/binaries
--- PASS: TestDownloadOnly/v1.35.0-beta.0/binaries (0.00s)

TestDownloadOnly/v1.35.0-beta.0/LogsDuration (0.08s)

=== RUN   TestDownloadOnly/v1.35.0-beta.0/LogsDuration
aaa_download_only_test.go:183: (dbg) Run:  out/minikube-linux-arm64 logs -p download-only-178568
aaa_download_only_test.go:183: (dbg) Non-zero exit: out/minikube-linux-arm64 logs -p download-only-178568: exit status 85 (82.466617ms)

-- stdout --
	
	==> Audit <==
	┌─────────┬──────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┬──────────────────────┬─────────┬─────────┬─────────────────────┬─────────────────────┐
	│ COMMAND │                                                                                             ARGS                                                                                             │       PROFILE        │  USER   │ VERSION │     START TIME      │      END TIME       │
	├─────────┼──────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┼──────────────────────┼─────────┼─────────┼─────────────────────┼─────────────────────┤
	│ start   │ -o=json --download-only -p download-only-132306 --force --alsologtostderr --kubernetes-version=v1.28.0 --container-runtime=containerd --driver=docker  --container-runtime=containerd        │ download-only-132306 │ jenkins │ v1.37.0 │ 02 Dec 25 18:47 UTC │                     │
	│ delete  │ --all                                                                                                                                                                                        │ minikube             │ jenkins │ v1.37.0 │ 02 Dec 25 18:47 UTC │ 02 Dec 25 18:47 UTC │
	│ delete  │ -p download-only-132306                                                                                                                                                                      │ download-only-132306 │ jenkins │ v1.37.0 │ 02 Dec 25 18:47 UTC │ 02 Dec 25 18:47 UTC │
	│ start   │ -o=json --download-only -p download-only-423907 --force --alsologtostderr --kubernetes-version=v1.34.2 --container-runtime=containerd --driver=docker  --container-runtime=containerd        │ download-only-423907 │ jenkins │ v1.37.0 │ 02 Dec 25 18:47 UTC │                     │
	│ delete  │ --all                                                                                                                                                                                        │ minikube             │ jenkins │ v1.37.0 │ 02 Dec 25 18:48 UTC │ 02 Dec 25 18:48 UTC │
	│ delete  │ -p download-only-423907                                                                                                                                                                      │ download-only-423907 │ jenkins │ v1.37.0 │ 02 Dec 25 18:48 UTC │ 02 Dec 25 18:48 UTC │
	│ start   │ -o=json --download-only -p download-only-178568 --force --alsologtostderr --kubernetes-version=v1.35.0-beta.0 --container-runtime=containerd --driver=docker  --container-runtime=containerd │ download-only-178568 │ jenkins │ v1.37.0 │ 02 Dec 25 18:48 UTC │                     │
	└─────────┴──────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┴──────────────────────┴─────────┴─────────┴─────────────────────┴─────────────────────┘
	
	
	==> Last Start <==
	Log file created at: 2025/12/02 18:48:06
	Running on machine: ip-172-31-31-251
	Binary: Built with gc go1.25.3 for linux/arm64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I1202 18:48:06.555657    4847 out.go:360] Setting OutFile to fd 1 ...
	I1202 18:48:06.556085    4847 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1202 18:48:06.556119    4847 out.go:374] Setting ErrFile to fd 2...
	I1202 18:48:06.556138    4847 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1202 18:48:06.556490    4847 root.go:338] Updating PATH: /home/jenkins/minikube-integration/22021-2487/.minikube/bin
	I1202 18:48:06.556958    4847 out.go:368] Setting JSON to true
	I1202 18:48:06.557716    4847 start.go:133] hostinfo: {"hostname":"ip-172-31-31-251","uptime":1823,"bootTime":1764699464,"procs":143,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1084-aws","kernelArch":"aarch64","virtualizationSystem":"","virtualizationRole":"","hostId":"982e3628-3742-4b3e-bb63-ac1b07660ec7"}
	I1202 18:48:06.557808    4847 start.go:143] virtualization:  
	I1202 18:48:06.561073    4847 out.go:99] [download-only-178568] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	I1202 18:48:06.561447    4847 notify.go:221] Checking for updates...
	I1202 18:48:06.564240    4847 out.go:171] MINIKUBE_LOCATION=22021
	I1202 18:48:06.567365    4847 out.go:171] MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I1202 18:48:06.570356    4847 out.go:171] KUBECONFIG=/home/jenkins/minikube-integration/22021-2487/kubeconfig
	I1202 18:48:06.573292    4847 out.go:171] MINIKUBE_HOME=/home/jenkins/minikube-integration/22021-2487/.minikube
	I1202 18:48:06.576530    4847 out.go:171] MINIKUBE_BIN=out/minikube-linux-arm64
	W1202 18:48:06.582396    4847 out.go:336] minikube skips various validations when --force is supplied; this may lead to unexpected behavior
	I1202 18:48:06.582720    4847 driver.go:422] Setting default libvirt URI to qemu:///system
	I1202 18:48:06.604447    4847 docker.go:124] docker version: linux-28.1.1:Docker Engine - Community
	I1202 18:48:06.604561    4847 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1202 18:48:06.685663    4847 info.go:266] docker info: {ID:EOU5:DNGX:XN6V:L2FZ:UXRM:5TWK:EVUR:KC2F:GT7Z:Y4O4:GB77:5PD3 Containers:0 ContainersRunning:0 ContainersPaused:0 ContainersStopped:0 Images:1 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:29 OomKillDisable:true NGoroutines:47 SystemTime:2025-12-02 18:48:06.675871874 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214839296 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-31-251 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1202 18:48:06.685774    4847 docker.go:319] overlay module found
	I1202 18:48:06.688814    4847 out.go:99] Using the docker driver based on user configuration
	I1202 18:48:06.688854    4847 start.go:309] selected driver: docker
	I1202 18:48:06.688862    4847 start.go:927] validating driver "docker" against <nil>
	I1202 18:48:06.688975    4847 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1202 18:48:06.746636    4847 info.go:266] docker info: {ID:EOU5:DNGX:XN6V:L2FZ:UXRM:5TWK:EVUR:KC2F:GT7Z:Y4O4:GB77:5PD3 Containers:0 ContainersRunning:0 ContainersPaused:0 ContainersStopped:0 Images:1 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:29 OomKillDisable:true NGoroutines:47 SystemTime:2025-12-02 18:48:06.737644892 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214839296 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-31-251 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1202 18:48:06.746800    4847 start_flags.go:327] no existing cluster config was found, will generate one from the flags 
	I1202 18:48:06.747081    4847 start_flags.go:410] Using suggested 3072MB memory alloc based on sys=7834MB, container=7834MB
	I1202 18:48:06.747236    4847 start_flags.go:974] Wait components to verify : map[apiserver:true system_pods:true]
	I1202 18:48:06.750411    4847 out.go:171] Using Docker driver with root privileges
	
	
	* The control-plane node download-only-178568 host does not exist
	  To start a cluster, run: "minikube start -p download-only-178568"
-- /stdout --
aaa_download_only_test.go:184: minikube logs failed with error: exit status 85
--- PASS: TestDownloadOnly/v1.35.0-beta.0/LogsDuration (0.08s)

TestDownloadOnly/v1.35.0-beta.0/DeleteAll (0.23s)

=== RUN   TestDownloadOnly/v1.35.0-beta.0/DeleteAll
aaa_download_only_test.go:196: (dbg) Run:  out/minikube-linux-arm64 delete --all
--- PASS: TestDownloadOnly/v1.35.0-beta.0/DeleteAll (0.23s)

TestDownloadOnly/v1.35.0-beta.0/DeleteAlwaysSucceeds (0.13s)

=== RUN   TestDownloadOnly/v1.35.0-beta.0/DeleteAlwaysSucceeds
aaa_download_only_test.go:207: (dbg) Run:  out/minikube-linux-arm64 delete -p download-only-178568
--- PASS: TestDownloadOnly/v1.35.0-beta.0/DeleteAlwaysSucceeds (0.13s)

TestBinaryMirror (0.61s)

=== RUN   TestBinaryMirror
I1202 18:48:10.431415    4435 binary.go:80] Not caching binary, using https://dl.k8s.io/release/v1.34.2/bin/linux/arm64/kubectl?checksum=file:https://dl.k8s.io/release/v1.34.2/bin/linux/arm64/kubectl.sha256
aaa_download_only_test.go:309: (dbg) Run:  out/minikube-linux-arm64 start --download-only -p binary-mirror-945214 --alsologtostderr --binary-mirror http://127.0.0.1:42443 --driver=docker  --container-runtime=containerd
helpers_test.go:175: Cleaning up "binary-mirror-945214" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-linux-arm64 delete -p binary-mirror-945214
--- PASS: TestBinaryMirror (0.61s)
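The binary-mirror test above downloads kubectl via a URL carrying a `?checksum=file:…kubectl.sha256` hint, i.e. the digest to verify is fetched from a sidecar file next to the binary. As a minimal, self-contained sketch of that verification pattern (the filenames `kubectl.bin` and `kubectl.bin.sha256` are hypothetical stand-ins, not minikube's actual cache paths):

```shell
# Sketch of the checksum=file: pattern: compare a downloaded artifact's
# SHA-256 against the digest published in its .sha256 sidecar file.
printf 'fake-kubectl-bytes' > kubectl.bin
# Stand-in for fetching the remote .sha256 sidecar:
sha256sum kubectl.bin | awk '{print $1}' > kubectl.bin.sha256

expected=$(cat kubectl.bin.sha256)
actual=$(sha256sum kubectl.bin | awk '{print $1}')
if [ "$expected" = "$actual" ]; then
  echo "checksum OK"
else
  echo "checksum MISMATCH" >&2
  exit 1
fi
```

On a mismatch a real downloader would discard the artifact and retry rather than cache it.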

TestAddons/PreSetup/EnablingAddonOnNonExistingCluster (0.08s)

=== RUN   TestAddons/PreSetup/EnablingAddonOnNonExistingCluster
=== PAUSE TestAddons/PreSetup/EnablingAddonOnNonExistingCluster
=== CONT  TestAddons/PreSetup/EnablingAddonOnNonExistingCluster
addons_test.go:1000: (dbg) Run:  out/minikube-linux-arm64 addons enable dashboard -p addons-932514
addons_test.go:1000: (dbg) Non-zero exit: out/minikube-linux-arm64 addons enable dashboard -p addons-932514: exit status 85 (84.256469ms)
-- stdout --
	* Profile "addons-932514" not found. Run "minikube profile list" to view all profiles.
	  To start a cluster, run: "minikube start -p addons-932514"
-- /stdout --
--- PASS: TestAddons/PreSetup/EnablingAddonOnNonExistingCluster (0.08s)

TestAddons/PreSetup/DisablingAddonOnNonExistingCluster (0.07s)

=== RUN   TestAddons/PreSetup/DisablingAddonOnNonExistingCluster
=== PAUSE TestAddons/PreSetup/DisablingAddonOnNonExistingCluster
=== CONT  TestAddons/PreSetup/DisablingAddonOnNonExistingCluster
addons_test.go:1011: (dbg) Run:  out/minikube-linux-arm64 addons disable dashboard -p addons-932514
addons_test.go:1011: (dbg) Non-zero exit: out/minikube-linux-arm64 addons disable dashboard -p addons-932514: exit status 85 (71.498742ms)
-- stdout --
	* Profile "addons-932514" not found. Run "minikube profile list" to view all profiles.
	  To start a cluster, run: "minikube start -p addons-932514"
-- /stdout --
--- PASS: TestAddons/PreSetup/DisablingAddonOnNonExistingCluster (0.07s)
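Both PreSetup tests above assert a specific non-zero exit code (85) from a command expected to fail. As a hedged, self-contained sketch of that assertion pattern (`fake_minikube` is a hypothetical stub standing in for the real `minikube addons` invocation):

```shell
# Sketch of asserting an expected non-zero exit status, as the tests above
# do for "enable/disable addon on a non-existing cluster".
fake_minikube() {
  # Stand-in for `minikube addons enable dashboard -p <missing-profile>`.
  echo '* Profile "addons-932514" not found.'
  return 85
}

fake_minikube
status=$?
if [ "$status" -eq 85 ]; then
  echo "got expected exit status 85"
else
  echo "unexpected exit status: $status" >&2
fi
```

Capturing `$?` immediately after the command matters: any intervening command would overwrite it.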

TestAddons/Setup (154.91s)

=== RUN   TestAddons/Setup
addons_test.go:108: (dbg) Run:  out/minikube-linux-arm64 start -p addons-932514 --wait=true --memory=4096 --alsologtostderr --addons=registry --addons=registry-creds --addons=metrics-server --addons=volumesnapshots --addons=csi-hostpath-driver --addons=gcp-auth --addons=cloud-spanner --addons=inspektor-gadget --addons=nvidia-device-plugin --addons=yakd --addons=volcano --addons=amd-gpu-device-plugin --driver=docker  --container-runtime=containerd --addons=ingress --addons=ingress-dns --addons=storage-provisioner-rancher
addons_test.go:108: (dbg) Done: out/minikube-linux-arm64 start -p addons-932514 --wait=true --memory=4096 --alsologtostderr --addons=registry --addons=registry-creds --addons=metrics-server --addons=volumesnapshots --addons=csi-hostpath-driver --addons=gcp-auth --addons=cloud-spanner --addons=inspektor-gadget --addons=nvidia-device-plugin --addons=yakd --addons=volcano --addons=amd-gpu-device-plugin --driver=docker  --container-runtime=containerd --addons=ingress --addons=ingress-dns --addons=storage-provisioner-rancher: (2m34.908765112s)
--- PASS: TestAddons/Setup (154.91s)

TestAddons/serial/Volcano (40.75s)

=== RUN   TestAddons/serial/Volcano
addons_test.go:884: volcano-controller stabilized in 72.499522ms
addons_test.go:876: volcano-admission stabilized in 73.091162ms
addons_test.go:868: volcano-scheduler stabilized in 73.549878ms
addons_test.go:890: (dbg) TestAddons/serial/Volcano: waiting 6m0s for pods matching "app=volcano-scheduler" in namespace "volcano-system" ...
helpers_test.go:352: "volcano-scheduler-76c996c8bf-hfnpb" [ddded7d8-4e2f-4704-a2cc-a7780cecbe76] Running
addons_test.go:890: (dbg) TestAddons/serial/Volcano: app=volcano-scheduler healthy within 6.003470417s
addons_test.go:894: (dbg) TestAddons/serial/Volcano: waiting 6m0s for pods matching "app=volcano-admission" in namespace "volcano-system" ...
helpers_test.go:352: "volcano-admission-6c447bd768-r7gkq" [4d795422-bd37-40fb-a196-e087e7abf043] Running
addons_test.go:894: (dbg) TestAddons/serial/Volcano: app=volcano-admission healthy within 5.00408535s
addons_test.go:898: (dbg) TestAddons/serial/Volcano: waiting 6m0s for pods matching "app=volcano-controller" in namespace "volcano-system" ...
helpers_test.go:352: "volcano-controllers-6fd4f85cb8-vbmnl" [5efb5a18-dc9a-435b-a8b4-88750e4e06b1] Running
addons_test.go:898: (dbg) TestAddons/serial/Volcano: app=volcano-controller healthy within 5.004356391s
addons_test.go:903: (dbg) Run:  kubectl --context addons-932514 delete -n volcano-system job volcano-admission-init
addons_test.go:909: (dbg) Run:  kubectl --context addons-932514 create -f testdata/vcjob.yaml
addons_test.go:917: (dbg) Run:  kubectl --context addons-932514 get vcjob -n my-volcano
addons_test.go:935: (dbg) TestAddons/serial/Volcano: waiting 3m0s for pods matching "volcano.sh/job-name=test-job" in namespace "my-volcano" ...
helpers_test.go:352: "test-job-nginx-0" [cc7abf08-5138-4f83-81a0-ee6e10383674] Pending
helpers_test.go:352: "test-job-nginx-0" [cc7abf08-5138-4f83-81a0-ee6e10383674] Pending / Ready:ContainersNotReady (containers with unready status: [nginx]) / ContainersReady:ContainersNotReady (containers with unready status: [nginx])
helpers_test.go:352: "test-job-nginx-0" [cc7abf08-5138-4f83-81a0-ee6e10383674] Running
addons_test.go:935: (dbg) TestAddons/serial/Volcano: volcano.sh/job-name=test-job healthy within 12.003601359s
addons_test.go:1053: (dbg) Run:  out/minikube-linux-arm64 -p addons-932514 addons disable volcano --alsologtostderr -v=1
addons_test.go:1053: (dbg) Done: out/minikube-linux-arm64 -p addons-932514 addons disable volcano --alsologtostderr -v=1: (12.044116805s)
--- PASS: TestAddons/serial/Volcano (40.75s)

TestAddons/serial/GCPAuth/Namespaces (0.19s)

=== RUN   TestAddons/serial/GCPAuth/Namespaces
addons_test.go:630: (dbg) Run:  kubectl --context addons-932514 create ns new-namespace
addons_test.go:644: (dbg) Run:  kubectl --context addons-932514 get secret gcp-auth -n new-namespace
--- PASS: TestAddons/serial/GCPAuth/Namespaces (0.19s)

TestAddons/serial/GCPAuth/FakeCredentials (8.86s)

=== RUN   TestAddons/serial/GCPAuth/FakeCredentials
addons_test.go:675: (dbg) Run:  kubectl --context addons-932514 create -f testdata/busybox.yaml
addons_test.go:682: (dbg) Run:  kubectl --context addons-932514 create sa gcp-auth-test
addons_test.go:688: (dbg) TestAddons/serial/GCPAuth/FakeCredentials: waiting 8m0s for pods matching "integration-test=busybox" in namespace "default" ...
helpers_test.go:352: "busybox" [feaacefb-dbef-4dbe-8fb4-59ea754d5dea] Pending / Ready:ContainersNotReady (containers with unready status: [busybox]) / ContainersReady:ContainersNotReady (containers with unready status: [busybox])
helpers_test.go:352: "busybox" [feaacefb-dbef-4dbe-8fb4-59ea754d5dea] Running
addons_test.go:688: (dbg) TestAddons/serial/GCPAuth/FakeCredentials: integration-test=busybox healthy within 8.00307158s
addons_test.go:694: (dbg) Run:  kubectl --context addons-932514 exec busybox -- /bin/sh -c "printenv GOOGLE_APPLICATION_CREDENTIALS"
addons_test.go:706: (dbg) Run:  kubectl --context addons-932514 describe sa gcp-auth-test
addons_test.go:720: (dbg) Run:  kubectl --context addons-932514 exec busybox -- /bin/sh -c "cat /google-app-creds.json"
addons_test.go:744: (dbg) Run:  kubectl --context addons-932514 exec busybox -- /bin/sh -c "printenv GOOGLE_CLOUD_PROJECT"
--- PASS: TestAddons/serial/GCPAuth/FakeCredentials (8.86s)

TestAddons/parallel/Registry (17.45s)

=== RUN   TestAddons/parallel/Registry
=== PAUSE TestAddons/parallel/Registry
=== CONT  TestAddons/parallel/Registry
addons_test.go:382: registry stabilized in 5.175346ms
addons_test.go:384: (dbg) TestAddons/parallel/Registry: waiting 6m0s for pods matching "actual-registry=true" in namespace "kube-system" ...
helpers_test.go:352: "registry-6b586f9694-w72gx" [59aacc7e-f443-4a4b-8a4f-03c58b416d16] Running
addons_test.go:384: (dbg) TestAddons/parallel/Registry: actual-registry=true healthy within 6.00334047s
addons_test.go:387: (dbg) TestAddons/parallel/Registry: waiting 10m0s for pods matching "registry-proxy=true" in namespace "kube-system" ...
helpers_test.go:352: "registry-proxy-z5jl6" [bcce8a7d-ee25-4575-af9a-76dcb1010c70] Running
addons_test.go:387: (dbg) TestAddons/parallel/Registry: registry-proxy=true healthy within 5.003361444s
addons_test.go:392: (dbg) Run:  kubectl --context addons-932514 delete po -l run=registry-test --now
addons_test.go:397: (dbg) Run:  kubectl --context addons-932514 run --rm registry-test --restart=Never --image=gcr.io/k8s-minikube/busybox -it -- sh -c "wget --spider -S http://registry.kube-system.svc.cluster.local"
addons_test.go:397: (dbg) Done: kubectl --context addons-932514 run --rm registry-test --restart=Never --image=gcr.io/k8s-minikube/busybox -it -- sh -c "wget --spider -S http://registry.kube-system.svc.cluster.local": (5.024359782s)
addons_test.go:411: (dbg) Run:  out/minikube-linux-arm64 -p addons-932514 ip
2025/12/02 18:52:01 [DEBUG] GET http://192.168.49.2:5000
addons_test.go:1053: (dbg) Run:  out/minikube-linux-arm64 -p addons-932514 addons disable registry --alsologtostderr -v=1
addons_test.go:1053: (dbg) Done: out/minikube-linux-arm64 -p addons-932514 addons disable registry --alsologtostderr -v=1: (1.109518998s)
--- PASS: TestAddons/parallel/Registry (17.45s)

TestAddons/parallel/RegistryCreds (0.74s)

=== RUN   TestAddons/parallel/RegistryCreds
=== PAUSE TestAddons/parallel/RegistryCreds
=== CONT  TestAddons/parallel/RegistryCreds
addons_test.go:323: registry-creds stabilized in 3.503781ms
addons_test.go:325: (dbg) Run:  out/minikube-linux-arm64 addons configure registry-creds -f ./testdata/addons_testconfig.json -p addons-932514
addons_test.go:332: (dbg) Run:  kubectl --context addons-932514 -n kube-system get secret -o yaml
addons_test.go:1053: (dbg) Run:  out/minikube-linux-arm64 -p addons-932514 addons disable registry-creds --alsologtostderr -v=1
--- PASS: TestAddons/parallel/RegistryCreds (0.74s)

TestAddons/parallel/Ingress (19.94s)

=== RUN   TestAddons/parallel/Ingress
=== PAUSE TestAddons/parallel/Ingress
=== CONT  TestAddons/parallel/Ingress
addons_test.go:209: (dbg) Run:  kubectl --context addons-932514 wait --for=condition=ready --namespace=ingress-nginx pod --selector=app.kubernetes.io/component=controller --timeout=90s
addons_test.go:234: (dbg) Run:  kubectl --context addons-932514 replace --force -f testdata/nginx-ingress-v1.yaml
addons_test.go:247: (dbg) Run:  kubectl --context addons-932514 replace --force -f testdata/nginx-pod-svc.yaml
addons_test.go:252: (dbg) TestAddons/parallel/Ingress: waiting 8m0s for pods matching "run=nginx" in namespace "default" ...
helpers_test.go:352: "nginx" [8d8786d2-aebc-4f64-b115-e1227e71a38d] Pending / Ready:ContainersNotReady (containers with unready status: [nginx]) / ContainersReady:ContainersNotReady (containers with unready status: [nginx])
helpers_test.go:352: "nginx" [8d8786d2-aebc-4f64-b115-e1227e71a38d] Running
addons_test.go:252: (dbg) TestAddons/parallel/Ingress: run=nginx healthy within 9.005348864s
I1202 18:52:30.066416    4435 kapi.go:150] Service nginx in namespace default found.
addons_test.go:264: (dbg) Run:  out/minikube-linux-arm64 -p addons-932514 ssh "curl -s http://127.0.0.1/ -H 'Host: nginx.example.com'"
addons_test.go:288: (dbg) Run:  kubectl --context addons-932514 replace --force -f testdata/ingress-dns-example-v1.yaml
addons_test.go:293: (dbg) Run:  out/minikube-linux-arm64 -p addons-932514 ip
addons_test.go:299: (dbg) Run:  nslookup hello-john.test 192.168.49.2
addons_test.go:1053: (dbg) Run:  out/minikube-linux-arm64 -p addons-932514 addons disable ingress-dns --alsologtostderr -v=1
addons_test.go:1053: (dbg) Done: out/minikube-linux-arm64 -p addons-932514 addons disable ingress-dns --alsologtostderr -v=1: (1.338947049s)
addons_test.go:1053: (dbg) Run:  out/minikube-linux-arm64 -p addons-932514 addons disable ingress --alsologtostderr -v=1
addons_test.go:1053: (dbg) Done: out/minikube-linux-arm64 -p addons-932514 addons disable ingress --alsologtostderr -v=1: (7.829386749s)
--- PASS: TestAddons/parallel/Ingress (19.94s)

TestAddons/parallel/InspektorGadget (10.92s)

=== RUN   TestAddons/parallel/InspektorGadget
=== PAUSE TestAddons/parallel/InspektorGadget
=== CONT  TestAddons/parallel/InspektorGadget
addons_test.go:823: (dbg) TestAddons/parallel/InspektorGadget: waiting 8m0s for pods matching "k8s-app=gadget" in namespace "gadget" ...
helpers_test.go:352: "gadget-82wcl" [09a2fbd6-5aca-4349-ada7-ba6d0e6ec3c0] Running
addons_test.go:823: (dbg) TestAddons/parallel/InspektorGadget: k8s-app=gadget healthy within 5.02289867s
addons_test.go:1053: (dbg) Run:  out/minikube-linux-arm64 -p addons-932514 addons disable inspektor-gadget --alsologtostderr -v=1
addons_test.go:1053: (dbg) Done: out/minikube-linux-arm64 -p addons-932514 addons disable inspektor-gadget --alsologtostderr -v=1: (5.893857165s)
--- PASS: TestAddons/parallel/InspektorGadget (10.92s)

TestAddons/parallel/MetricsServer (6.85s)

=== RUN   TestAddons/parallel/MetricsServer
=== PAUSE TestAddons/parallel/MetricsServer
=== CONT  TestAddons/parallel/MetricsServer
addons_test.go:455: metrics-server stabilized in 12.848163ms
addons_test.go:457: (dbg) TestAddons/parallel/MetricsServer: waiting 6m0s for pods matching "k8s-app=metrics-server" in namespace "kube-system" ...
helpers_test.go:352: "metrics-server-85b7d694d7-dqdzh" [a405a43b-7a12-4c3f-9ff2-1747fc74cc0a] Running
addons_test.go:457: (dbg) TestAddons/parallel/MetricsServer: k8s-app=metrics-server healthy within 6.003553706s
addons_test.go:463: (dbg) Run:  kubectl --context addons-932514 top pods -n kube-system
addons_test.go:1053: (dbg) Run:  out/minikube-linux-arm64 -p addons-932514 addons disable metrics-server --alsologtostderr -v=1
--- PASS: TestAddons/parallel/MetricsServer (6.85s)

TestAddons/parallel/CSI (31.29s)

=== RUN   TestAddons/parallel/CSI
=== PAUSE TestAddons/parallel/CSI
=== CONT  TestAddons/parallel/CSI
I1202 18:52:02.386306    4435 kapi.go:75] Waiting for pod with label "kubernetes.io/minikube-addons=csi-hostpath-driver" in ns "kube-system" ...
I1202 18:52:02.414711    4435 kapi.go:86] Found 3 Pods for label selector kubernetes.io/minikube-addons=csi-hostpath-driver
I1202 18:52:02.414741    4435 kapi.go:107] duration metric: took 31.732344ms to wait for kubernetes.io/minikube-addons=csi-hostpath-driver ...
addons_test.go:549: csi-hostpath-driver pods stabilized in 31.743395ms
addons_test.go:552: (dbg) Run:  kubectl --context addons-932514 create -f testdata/csi-hostpath-driver/pvc.yaml
addons_test.go:557: (dbg) TestAddons/parallel/CSI: waiting 6m0s for pvc "hpvc" in namespace "default" ...
helpers_test.go:402: (dbg) Run:  kubectl --context addons-932514 get pvc hpvc -o jsonpath={.status.phase} -n default
helpers_test.go:402: (dbg) Run:  kubectl --context addons-932514 get pvc hpvc -o jsonpath={.status.phase} -n default
addons_test.go:562: (dbg) Run:  kubectl --context addons-932514 create -f testdata/csi-hostpath-driver/pv-pod.yaml
addons_test.go:567: (dbg) TestAddons/parallel/CSI: waiting 6m0s for pods matching "app=task-pv-pod" in namespace "default" ...
helpers_test.go:352: "task-pv-pod" [23e5e32e-1fe7-47f5-b193-7a4eb72e7d00] Pending
helpers_test.go:352: "task-pv-pod" [23e5e32e-1fe7-47f5-b193-7a4eb72e7d00] Pending / Ready:ContainersNotReady (containers with unready status: [task-pv-container]) / ContainersReady:ContainersNotReady (containers with unready status: [task-pv-container])
helpers_test.go:352: "task-pv-pod" [23e5e32e-1fe7-47f5-b193-7a4eb72e7d00] Running
addons_test.go:567: (dbg) TestAddons/parallel/CSI: app=task-pv-pod healthy within 8.003534817s
addons_test.go:572: (dbg) Run:  kubectl --context addons-932514 create -f testdata/csi-hostpath-driver/snapshot.yaml
addons_test.go:577: (dbg) TestAddons/parallel/CSI: waiting 6m0s for volume snapshot "new-snapshot-demo" in namespace "default" ...
helpers_test.go:427: (dbg) Run:  kubectl --context addons-932514 get volumesnapshot new-snapshot-demo -o jsonpath={.status.readyToUse} -n default
helpers_test.go:427: (dbg) Run:  kubectl --context addons-932514 get volumesnapshot new-snapshot-demo -o jsonpath={.status.readyToUse} -n default
addons_test.go:582: (dbg) Run:  kubectl --context addons-932514 delete pod task-pv-pod
addons_test.go:582: (dbg) Done: kubectl --context addons-932514 delete pod task-pv-pod: (1.180306676s)
addons_test.go:588: (dbg) Run:  kubectl --context addons-932514 delete pvc hpvc
addons_test.go:594: (dbg) Run:  kubectl --context addons-932514 create -f testdata/csi-hostpath-driver/pvc-restore.yaml
addons_test.go:599: (dbg) TestAddons/parallel/CSI: waiting 6m0s for pvc "hpvc-restore" in namespace "default" ...
helpers_test.go:402: (dbg) Run:  kubectl --context addons-932514 get pvc hpvc-restore -o jsonpath={.status.phase} -n default
helpers_test.go:402: (dbg) Run:  kubectl --context addons-932514 get pvc hpvc-restore -o jsonpath={.status.phase} -n default
helpers_test.go:402: (dbg) Run:  kubectl --context addons-932514 get pvc hpvc-restore -o jsonpath={.status.phase} -n default
addons_test.go:604: (dbg) Run:  kubectl --context addons-932514 create -f testdata/csi-hostpath-driver/pv-pod-restore.yaml
addons_test.go:609: (dbg) TestAddons/parallel/CSI: waiting 6m0s for pods matching "app=task-pv-pod-restore" in namespace "default" ...
helpers_test.go:352: "task-pv-pod-restore" [37031c3b-47e9-4363-b69b-d7140b4d1f21] Pending
helpers_test.go:352: "task-pv-pod-restore" [37031c3b-47e9-4363-b69b-d7140b4d1f21] Pending / Ready:ContainersNotReady (containers with unready status: [task-pv-container]) / ContainersReady:ContainersNotReady (containers with unready status: [task-pv-container])
helpers_test.go:352: "task-pv-pod-restore" [37031c3b-47e9-4363-b69b-d7140b4d1f21] Running
addons_test.go:609: (dbg) TestAddons/parallel/CSI: app=task-pv-pod-restore healthy within 7.005703491s
addons_test.go:614: (dbg) Run:  kubectl --context addons-932514 delete pod task-pv-pod-restore
addons_test.go:614: (dbg) Done: kubectl --context addons-932514 delete pod task-pv-pod-restore: (1.377201546s)
addons_test.go:618: (dbg) Run:  kubectl --context addons-932514 delete pvc hpvc-restore
addons_test.go:622: (dbg) Run:  kubectl --context addons-932514 delete volumesnapshot new-snapshot-demo
addons_test.go:1053: (dbg) Run:  out/minikube-linux-arm64 -p addons-932514 addons disable volumesnapshots --alsologtostderr -v=1
addons_test.go:1053: (dbg) Done: out/minikube-linux-arm64 -p addons-932514 addons disable volumesnapshots --alsologtostderr -v=1: (1.055378371s)
addons_test.go:1053: (dbg) Run:  out/minikube-linux-arm64 -p addons-932514 addons disable csi-hostpath-driver --alsologtostderr -v=1
addons_test.go:1053: (dbg) Done: out/minikube-linux-arm64 -p addons-932514 addons disable csi-hostpath-driver --alsologtostderr -v=1: (6.848539106s)
--- PASS: TestAddons/parallel/CSI (31.29s)
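The repeated `kubectl get pvc … -o jsonpath={.status.phase}` lines in the CSI test above are a poll loop: the helper re-reads the PVC phase until it reports `Bound`. A minimal sketch of that loop, with a hypothetical `fake_phase` function standing in for the real kubectl call (the real helper also sleeps between polls and enforces the 6m0s timeout):

```shell
# Sketch of polling a resource's phase until it reaches the desired state,
# as helpers_test.go does while waiting for the "hpvc" PVC to bind.
n=0
fake_phase() {
  # Report Pending for the first two polls, then Bound.
  if [ "$n" -ge 2 ]; then echo Bound; else echo Pending; fi
}

phase=$(fake_phase)
while [ "$phase" != "Bound" ]; do
  n=$((n + 1))
  phase=$(fake_phase)
done
echo "pvc reached phase $phase after $n polls"
```

Each `helpers_test.go:402` line in the log corresponds to one iteration of such a loop.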

TestAddons/parallel/Headlamp (17.19s)

=== RUN   TestAddons/parallel/Headlamp
=== PAUSE TestAddons/parallel/Headlamp
=== CONT  TestAddons/parallel/Headlamp
addons_test.go:808: (dbg) Run:  out/minikube-linux-arm64 addons enable headlamp -p addons-932514 --alsologtostderr -v=1
addons_test.go:808: (dbg) Done: out/minikube-linux-arm64 addons enable headlamp -p addons-932514 --alsologtostderr -v=1: (1.061239807s)
addons_test.go:813: (dbg) TestAddons/parallel/Headlamp: waiting 8m0s for pods matching "app.kubernetes.io/name=headlamp" in namespace "headlamp" ...
helpers_test.go:352: "headlamp-dfcdc64b-wnd4m" [61ba6336-44aa-401d-a370-f4ab84f1befc] Pending / Ready:ContainersNotReady (containers with unready status: [headlamp]) / ContainersReady:ContainersNotReady (containers with unready status: [headlamp])
helpers_test.go:352: "headlamp-dfcdc64b-wnd4m" [61ba6336-44aa-401d-a370-f4ab84f1befc] Running
addons_test.go:813: (dbg) TestAddons/parallel/Headlamp: app.kubernetes.io/name=headlamp healthy within 10.003563058s
addons_test.go:1053: (dbg) Run:  out/minikube-linux-arm64 -p addons-932514 addons disable headlamp --alsologtostderr -v=1
addons_test.go:1053: (dbg) Done: out/minikube-linux-arm64 -p addons-932514 addons disable headlamp --alsologtostderr -v=1: (6.119847872s)
--- PASS: TestAddons/parallel/Headlamp (17.19s)

TestAddons/parallel/CloudSpanner (5.59s)

=== RUN   TestAddons/parallel/CloudSpanner
=== PAUSE TestAddons/parallel/CloudSpanner
=== CONT  TestAddons/parallel/CloudSpanner
addons_test.go:840: (dbg) TestAddons/parallel/CloudSpanner: waiting 6m0s for pods matching "app=cloud-spanner-emulator" in namespace "default" ...
helpers_test.go:352: "cloud-spanner-emulator-5bdddb765-r59j5" [dcad38c0-5f50-4e99-ab4f-1c95213b1d22] Running
addons_test.go:840: (dbg) TestAddons/parallel/CloudSpanner: app=cloud-spanner-emulator healthy within 5.003839941s
addons_test.go:1053: (dbg) Run:  out/minikube-linux-arm64 -p addons-932514 addons disable cloud-spanner --alsologtostderr -v=1
--- PASS: TestAddons/parallel/CloudSpanner (5.59s)

TestAddons/parallel/LocalPath (51.08s)

=== RUN   TestAddons/parallel/LocalPath
=== PAUSE TestAddons/parallel/LocalPath
=== CONT  TestAddons/parallel/LocalPath
addons_test.go:949: (dbg) Run:  kubectl --context addons-932514 apply -f testdata/storage-provisioner-rancher/pvc.yaml
addons_test.go:955: (dbg) Run:  kubectl --context addons-932514 apply -f testdata/storage-provisioner-rancher/pod.yaml
addons_test.go:959: (dbg) TestAddons/parallel/LocalPath: waiting 5m0s for pvc "test-pvc" in namespace "default" ...
helpers_test.go:402: (dbg) Run:  kubectl --context addons-932514 get pvc test-pvc -o jsonpath={.status.phase} -n default
helpers_test.go:402: (dbg) Run:  kubectl --context addons-932514 get pvc test-pvc -o jsonpath={.status.phase} -n default
helpers_test.go:402: (dbg) Run:  kubectl --context addons-932514 get pvc test-pvc -o jsonpath={.status.phase} -n default
helpers_test.go:402: (dbg) Run:  kubectl --context addons-932514 get pvc test-pvc -o jsonpath={.status.phase} -n default
helpers_test.go:402: (dbg) Run:  kubectl --context addons-932514 get pvc test-pvc -o jsonpath={.status.phase} -n default
addons_test.go:962: (dbg) TestAddons/parallel/LocalPath: waiting 3m0s for pods matching "run=test-local-path" in namespace "default" ...
helpers_test.go:352: "test-local-path" [119424ca-cb59-4e9b-9eda-7b33fa0e7758] Pending
helpers_test.go:352: "test-local-path" [119424ca-cb59-4e9b-9eda-7b33fa0e7758] Pending / Initialized:PodCompleted / Ready:PodCompleted / ContainersReady:PodCompleted
helpers_test.go:352: "test-local-path" [119424ca-cb59-4e9b-9eda-7b33fa0e7758] Succeeded / Initialized:PodCompleted / Ready:PodCompleted / ContainersReady:PodCompleted
addons_test.go:962: (dbg) TestAddons/parallel/LocalPath: run=test-local-path healthy within 3.002922602s
addons_test.go:967: (dbg) Run:  kubectl --context addons-932514 get pvc test-pvc -o=json
addons_test.go:976: (dbg) Run:  out/minikube-linux-arm64 -p addons-932514 ssh "cat /opt/local-path-provisioner/pvc-e5f08d05-d080-46c4-8682-02b4e526b4ec_default_test-pvc/file1"
addons_test.go:988: (dbg) Run:  kubectl --context addons-932514 delete pod test-local-path
addons_test.go:992: (dbg) Run:  kubectl --context addons-932514 delete pvc test-pvc
addons_test.go:1053: (dbg) Run:  out/minikube-linux-arm64 -p addons-932514 addons disable storage-provisioner-rancher --alsologtostderr -v=1
addons_test.go:1053: (dbg) Done: out/minikube-linux-arm64 -p addons-932514 addons disable storage-provisioner-rancher --alsologtostderr -v=1: (42.958493279s)
--- PASS: TestAddons/parallel/LocalPath (51.08s)

TestAddons/parallel/NvidiaDevicePlugin (5.53s)

=== RUN   TestAddons/parallel/NvidiaDevicePlugin
=== PAUSE TestAddons/parallel/NvidiaDevicePlugin
=== CONT  TestAddons/parallel/NvidiaDevicePlugin
addons_test.go:1025: (dbg) TestAddons/parallel/NvidiaDevicePlugin: waiting 6m0s for pods matching "name=nvidia-device-plugin-ds" in namespace "kube-system" ...
helpers_test.go:352: "nvidia-device-plugin-daemonset-f248n" [e6bbf29d-7425-4641-85d1-863d6293a959] Running
addons_test.go:1025: (dbg) TestAddons/parallel/NvidiaDevicePlugin: name=nvidia-device-plugin-ds healthy within 5.003950048s
addons_test.go:1053: (dbg) Run:  out/minikube-linux-arm64 -p addons-932514 addons disable nvidia-device-plugin --alsologtostderr -v=1
--- PASS: TestAddons/parallel/NvidiaDevicePlugin (5.53s)

TestAddons/parallel/Yakd (11.83s)

=== RUN   TestAddons/parallel/Yakd
=== PAUSE TestAddons/parallel/Yakd

                                                
                                                

                                                
                                                
=== CONT  TestAddons/parallel/Yakd
addons_test.go:1047: (dbg) TestAddons/parallel/Yakd: waiting 2m0s for pods matching "app.kubernetes.io/name=yakd-dashboard" in namespace "yakd-dashboard" ...
helpers_test.go:352: "yakd-dashboard-5ff678cb9-s5x29" [c4d674cb-983d-4134-99f1-e515f939db43] Running
addons_test.go:1047: (dbg) TestAddons/parallel/Yakd: app.kubernetes.io/name=yakd-dashboard healthy within 6.005380792s
addons_test.go:1053: (dbg) Run:  out/minikube-linux-arm64 -p addons-932514 addons disable yakd --alsologtostderr -v=1
addons_test.go:1053: (dbg) Done: out/minikube-linux-arm64 -p addons-932514 addons disable yakd --alsologtostderr -v=1: (5.823088548s)
--- PASS: TestAddons/parallel/Yakd (11.83s)

TestAddons/StoppedEnableDisable (12.35s)

=== RUN   TestAddons/StoppedEnableDisable
addons_test.go:172: (dbg) Run:  out/minikube-linux-arm64 stop -p addons-932514
addons_test.go:172: (dbg) Done: out/minikube-linux-arm64 stop -p addons-932514: (12.070888093s)
addons_test.go:176: (dbg) Run:  out/minikube-linux-arm64 addons enable dashboard -p addons-932514
addons_test.go:180: (dbg) Run:  out/minikube-linux-arm64 addons disable dashboard -p addons-932514
addons_test.go:185: (dbg) Run:  out/minikube-linux-arm64 addons disable gvisor -p addons-932514
--- PASS: TestAddons/StoppedEnableDisable (12.35s)

TestCertOptions (37.44s)

=== RUN   TestCertOptions
=== PAUSE TestCertOptions

=== CONT  TestCertOptions
cert_options_test.go:49: (dbg) Run:  out/minikube-linux-arm64 start -p cert-options-504298 --memory=3072 --apiserver-ips=127.0.0.1 --apiserver-ips=192.168.15.15 --apiserver-names=localhost --apiserver-names=www.google.com --apiserver-port=8555 --driver=docker  --container-runtime=containerd
cert_options_test.go:49: (dbg) Done: out/minikube-linux-arm64 start -p cert-options-504298 --memory=3072 --apiserver-ips=127.0.0.1 --apiserver-ips=192.168.15.15 --apiserver-names=localhost --apiserver-names=www.google.com --apiserver-port=8555 --driver=docker  --container-runtime=containerd: (34.530694707s)
cert_options_test.go:60: (dbg) Run:  out/minikube-linux-arm64 -p cert-options-504298 ssh "openssl x509 -text -noout -in /var/lib/minikube/certs/apiserver.crt"
cert_options_test.go:88: (dbg) Run:  kubectl --context cert-options-504298 config view
cert_options_test.go:100: (dbg) Run:  out/minikube-linux-arm64 ssh -p cert-options-504298 -- "sudo cat /etc/kubernetes/admin.conf"
helpers_test.go:175: Cleaning up "cert-options-504298" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-linux-arm64 delete -p cert-options-504298
helpers_test.go:178: (dbg) Done: out/minikube-linux-arm64 delete -p cert-options-504298: (2.147076838s)
--- PASS: TestCertOptions (37.44s)

TestCertExpiration (233.32s)

=== RUN   TestCertExpiration
=== PAUSE TestCertExpiration

=== CONT  TestCertExpiration
cert_options_test.go:123: (dbg) Run:  out/minikube-linux-arm64 start -p cert-expiration-142441 --memory=3072 --cert-expiration=3m --driver=docker  --container-runtime=containerd
cert_options_test.go:123: (dbg) Done: out/minikube-linux-arm64 start -p cert-expiration-142441 --memory=3072 --cert-expiration=3m --driver=docker  --container-runtime=containerd: (41.776910716s)
cert_options_test.go:131: (dbg) Run:  out/minikube-linux-arm64 start -p cert-expiration-142441 --memory=3072 --cert-expiration=8760h --driver=docker  --container-runtime=containerd
cert_options_test.go:131: (dbg) Done: out/minikube-linux-arm64 start -p cert-expiration-142441 --memory=3072 --cert-expiration=8760h --driver=docker  --container-runtime=containerd: (8.453810865s)
helpers_test.go:175: Cleaning up "cert-expiration-142441" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-linux-arm64 delete -p cert-expiration-142441
helpers_test.go:178: (dbg) Done: out/minikube-linux-arm64 delete -p cert-expiration-142441: (3.089463654s)
--- PASS: TestCertExpiration (233.32s)

TestForceSystemdFlag (42.58s)

=== RUN   TestForceSystemdFlag
=== PAUSE TestForceSystemdFlag

=== CONT  TestForceSystemdFlag
docker_test.go:91: (dbg) Run:  out/minikube-linux-arm64 start -p force-systemd-flag-818934 --memory=3072 --force-systemd --alsologtostderr -v=5 --driver=docker  --container-runtime=containerd
docker_test.go:91: (dbg) Done: out/minikube-linux-arm64 start -p force-systemd-flag-818934 --memory=3072 --force-systemd --alsologtostderr -v=5 --driver=docker  --container-runtime=containerd: (39.978616329s)
docker_test.go:121: (dbg) Run:  out/minikube-linux-arm64 -p force-systemd-flag-818934 ssh "cat /etc/containerd/config.toml"
helpers_test.go:175: Cleaning up "force-systemd-flag-818934" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-linux-arm64 delete -p force-systemd-flag-818934
helpers_test.go:178: (dbg) Done: out/minikube-linux-arm64 delete -p force-systemd-flag-818934: (2.231372768s)
--- PASS: TestForceSystemdFlag (42.58s)

TestForceSystemdEnv (44.96s)

=== RUN   TestForceSystemdEnv
=== PAUSE TestForceSystemdEnv

=== CONT  TestForceSystemdEnv
docker_test.go:155: (dbg) Run:  out/minikube-linux-arm64 start -p force-systemd-env-964316 --memory=3072 --alsologtostderr -v=5 --driver=docker  --container-runtime=containerd
E1202 20:13:00.704388    4435 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-224594/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
docker_test.go:155: (dbg) Done: out/minikube-linux-arm64 start -p force-systemd-env-964316 --memory=3072 --alsologtostderr -v=5 --driver=docker  --container-runtime=containerd: (41.757210574s)
docker_test.go:121: (dbg) Run:  out/minikube-linux-arm64 -p force-systemd-env-964316 ssh "cat /etc/containerd/config.toml"
helpers_test.go:175: Cleaning up "force-systemd-env-964316" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-linux-arm64 delete -p force-systemd-env-964316
helpers_test.go:178: (dbg) Done: out/minikube-linux-arm64 delete -p force-systemd-env-964316: (2.634128198s)
--- PASS: TestForceSystemdEnv (44.96s)

TestDockerEnvContainerd (48.4s)

=== RUN   TestDockerEnvContainerd
docker_test.go:170: running with containerd true linux arm64
docker_test.go:181: (dbg) Run:  out/minikube-linux-arm64 start -p dockerenv-003477 --driver=docker  --container-runtime=containerd
docker_test.go:181: (dbg) Done: out/minikube-linux-arm64 start -p dockerenv-003477 --driver=docker  --container-runtime=containerd: (32.608525908s)
docker_test.go:189: (dbg) Run:  /bin/bash -c "out/minikube-linux-arm64 docker-env --ssh-host --ssh-add -p dockerenv-003477"
docker_test.go:189: (dbg) Done: /bin/bash -c "out/minikube-linux-arm64 docker-env --ssh-host --ssh-add -p dockerenv-003477": (1.143892164s)
docker_test.go:220: (dbg) Run:  /bin/bash -c "SSH_AUTH_SOCK="/tmp/ssh-XAxW7bmysQez/agent.23784" SSH_AGENT_PID="23785" DOCKER_HOST=ssh://docker@127.0.0.1:32773 docker version"
docker_test.go:243: (dbg) Run:  /bin/bash -c "SSH_AUTH_SOCK="/tmp/ssh-XAxW7bmysQez/agent.23784" SSH_AGENT_PID="23785" DOCKER_HOST=ssh://docker@127.0.0.1:32773 DOCKER_BUILDKIT=0 docker build -t local/minikube-dockerenv-containerd-test:latest testdata/docker-env"
docker_test.go:243: (dbg) Done: /bin/bash -c "SSH_AUTH_SOCK="/tmp/ssh-XAxW7bmysQez/agent.23784" SSH_AGENT_PID="23785" DOCKER_HOST=ssh://docker@127.0.0.1:32773 DOCKER_BUILDKIT=0 docker build -t local/minikube-dockerenv-containerd-test:latest testdata/docker-env": (1.068170855s)
docker_test.go:250: (dbg) Run:  /bin/bash -c "SSH_AUTH_SOCK="/tmp/ssh-XAxW7bmysQez/agent.23784" SSH_AGENT_PID="23785" DOCKER_HOST=ssh://docker@127.0.0.1:32773 docker image ls"
helpers_test.go:175: Cleaning up "dockerenv-003477" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-linux-arm64 delete -p dockerenv-003477
helpers_test.go:178: (dbg) Done: out/minikube-linux-arm64 delete -p dockerenv-003477: (2.082385886s)
--- PASS: TestDockerEnvContainerd (48.40s)

TestErrorSpam/setup (33.25s)

=== RUN   TestErrorSpam/setup
error_spam_test.go:81: (dbg) Run:  out/minikube-linux-arm64 start -p nospam-057145 -n=1 --memory=3072 --wait=false --log_dir=/tmp/nospam-057145 --driver=docker  --container-runtime=containerd
error_spam_test.go:81: (dbg) Done: out/minikube-linux-arm64 start -p nospam-057145 -n=1 --memory=3072 --wait=false --log_dir=/tmp/nospam-057145 --driver=docker  --container-runtime=containerd: (33.245925123s)
--- PASS: TestErrorSpam/setup (33.25s)

TestErrorSpam/start (0.8s)

=== RUN   TestErrorSpam/start
error_spam_test.go:206: Cleaning up 1 logfile(s) ...
error_spam_test.go:149: (dbg) Run:  out/minikube-linux-arm64 -p nospam-057145 --log_dir /tmp/nospam-057145 start --dry-run
error_spam_test.go:149: (dbg) Run:  out/minikube-linux-arm64 -p nospam-057145 --log_dir /tmp/nospam-057145 start --dry-run
error_spam_test.go:172: (dbg) Run:  out/minikube-linux-arm64 -p nospam-057145 --log_dir /tmp/nospam-057145 start --dry-run
--- PASS: TestErrorSpam/start (0.80s)

TestErrorSpam/status (1.2s)

=== RUN   TestErrorSpam/status
error_spam_test.go:206: Cleaning up 0 logfile(s) ...
error_spam_test.go:149: (dbg) Run:  out/minikube-linux-arm64 -p nospam-057145 --log_dir /tmp/nospam-057145 status
error_spam_test.go:149: (dbg) Run:  out/minikube-linux-arm64 -p nospam-057145 --log_dir /tmp/nospam-057145 status
error_spam_test.go:172: (dbg) Run:  out/minikube-linux-arm64 -p nospam-057145 --log_dir /tmp/nospam-057145 status
--- PASS: TestErrorSpam/status (1.20s)

TestErrorSpam/pause (1.84s)

=== RUN   TestErrorSpam/pause
error_spam_test.go:206: Cleaning up 0 logfile(s) ...
error_spam_test.go:149: (dbg) Run:  out/minikube-linux-arm64 -p nospam-057145 --log_dir /tmp/nospam-057145 pause
error_spam_test.go:149: (dbg) Run:  out/minikube-linux-arm64 -p nospam-057145 --log_dir /tmp/nospam-057145 pause
error_spam_test.go:172: (dbg) Run:  out/minikube-linux-arm64 -p nospam-057145 --log_dir /tmp/nospam-057145 pause
--- PASS: TestErrorSpam/pause (1.84s)

TestErrorSpam/unpause (1.98s)

=== RUN   TestErrorSpam/unpause
error_spam_test.go:206: Cleaning up 0 logfile(s) ...
error_spam_test.go:149: (dbg) Run:  out/minikube-linux-arm64 -p nospam-057145 --log_dir /tmp/nospam-057145 unpause
error_spam_test.go:149: (dbg) Run:  out/minikube-linux-arm64 -p nospam-057145 --log_dir /tmp/nospam-057145 unpause
error_spam_test.go:172: (dbg) Run:  out/minikube-linux-arm64 -p nospam-057145 --log_dir /tmp/nospam-057145 unpause
--- PASS: TestErrorSpam/unpause (1.98s)

TestErrorSpam/stop (1.58s)

=== RUN   TestErrorSpam/stop
error_spam_test.go:206: Cleaning up 0 logfile(s) ...
error_spam_test.go:149: (dbg) Run:  out/minikube-linux-arm64 -p nospam-057145 --log_dir /tmp/nospam-057145 stop
error_spam_test.go:149: (dbg) Done: out/minikube-linux-arm64 -p nospam-057145 --log_dir /tmp/nospam-057145 stop: (1.371922782s)
error_spam_test.go:149: (dbg) Run:  out/minikube-linux-arm64 -p nospam-057145 --log_dir /tmp/nospam-057145 stop
error_spam_test.go:172: (dbg) Run:  out/minikube-linux-arm64 -p nospam-057145 --log_dir /tmp/nospam-057145 stop
--- PASS: TestErrorSpam/stop (1.58s)

TestFunctional/serial/CopySyncFile (0s)

=== RUN   TestFunctional/serial/CopySyncFile
functional_test.go:1860: local sync path: /home/jenkins/minikube-integration/22021-2487/.minikube/files/etc/test/nested/copy/4435/hosts
--- PASS: TestFunctional/serial/CopySyncFile (0.00s)

TestFunctional/serial/StartWithProxy (76.63s)

=== RUN   TestFunctional/serial/StartWithProxy
functional_test.go:2239: (dbg) Run:  out/minikube-linux-arm64 start -p functional-224594 --memory=4096 --apiserver-port=8441 --wait=all --driver=docker  --container-runtime=containerd
E1202 18:55:46.071649    4435 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/addons-932514/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1202 18:55:46.077989    4435 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/addons-932514/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1202 18:55:46.089280    4435 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/addons-932514/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1202 18:55:46.110602    4435 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/addons-932514/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1202 18:55:46.151923    4435 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/addons-932514/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1202 18:55:46.233269    4435 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/addons-932514/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1202 18:55:46.394693    4435 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/addons-932514/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1202 18:55:46.716291    4435 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/addons-932514/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1202 18:55:47.358260    4435 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/addons-932514/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1202 18:55:48.639549    4435 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/addons-932514/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1202 18:55:51.201217    4435 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/addons-932514/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1202 18:55:56.322548    4435 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/addons-932514/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1202 18:56:06.563915    4435 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/addons-932514/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1202 18:56:27.045320    4435 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/addons-932514/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
functional_test.go:2239: (dbg) Done: out/minikube-linux-arm64 start -p functional-224594 --memory=4096 --apiserver-port=8441 --wait=all --driver=docker  --container-runtime=containerd: (1m16.621440365s)
--- PASS: TestFunctional/serial/StartWithProxy (76.63s)

TestFunctional/serial/AuditLog (0s)

=== RUN   TestFunctional/serial/AuditLog
--- PASS: TestFunctional/serial/AuditLog (0.00s)

TestFunctional/serial/SoftStart (7.46s)

=== RUN   TestFunctional/serial/SoftStart
I1202 18:56:41.414592    4435 config.go:182] Loaded profile config "functional-224594": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.34.2
functional_test.go:674: (dbg) Run:  out/minikube-linux-arm64 start -p functional-224594 --alsologtostderr -v=8
functional_test.go:674: (dbg) Done: out/minikube-linux-arm64 start -p functional-224594 --alsologtostderr -v=8: (7.459790359s)
functional_test.go:678: soft start took 7.461901121s for "functional-224594" cluster.
I1202 18:56:48.875359    4435 config.go:182] Loaded profile config "functional-224594": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.34.2
--- PASS: TestFunctional/serial/SoftStart (7.46s)

TestFunctional/serial/KubeContext (0.07s)

=== RUN   TestFunctional/serial/KubeContext
functional_test.go:696: (dbg) Run:  kubectl config current-context
--- PASS: TestFunctional/serial/KubeContext (0.07s)

TestFunctional/serial/KubectlGetPods (0.13s)

=== RUN   TestFunctional/serial/KubectlGetPods
functional_test.go:711: (dbg) Run:  kubectl --context functional-224594 get po -A
--- PASS: TestFunctional/serial/KubectlGetPods (0.13s)

TestFunctional/serial/CacheCmd/cache/add_remote (3.65s)

=== RUN   TestFunctional/serial/CacheCmd/cache/add_remote
functional_test.go:1064: (dbg) Run:  out/minikube-linux-arm64 -p functional-224594 cache add registry.k8s.io/pause:3.1
functional_test.go:1064: (dbg) Done: out/minikube-linux-arm64 -p functional-224594 cache add registry.k8s.io/pause:3.1: (1.320775987s)
functional_test.go:1064: (dbg) Run:  out/minikube-linux-arm64 -p functional-224594 cache add registry.k8s.io/pause:3.3
functional_test.go:1064: (dbg) Done: out/minikube-linux-arm64 -p functional-224594 cache add registry.k8s.io/pause:3.3: (1.274994723s)
functional_test.go:1064: (dbg) Run:  out/minikube-linux-arm64 -p functional-224594 cache add registry.k8s.io/pause:latest
functional_test.go:1064: (dbg) Done: out/minikube-linux-arm64 -p functional-224594 cache add registry.k8s.io/pause:latest: (1.054537837s)
--- PASS: TestFunctional/serial/CacheCmd/cache/add_remote (3.65s)

TestFunctional/serial/CacheCmd/cache/add_local (1.33s)

=== RUN   TestFunctional/serial/CacheCmd/cache/add_local
functional_test.go:1092: (dbg) Run:  docker build -t minikube-local-cache-test:functional-224594 /tmp/TestFunctionalserialCacheCmdcacheadd_local1828660971/001
functional_test.go:1104: (dbg) Run:  out/minikube-linux-arm64 -p functional-224594 cache add minikube-local-cache-test:functional-224594
functional_test.go:1109: (dbg) Run:  out/minikube-linux-arm64 -p functional-224594 cache delete minikube-local-cache-test:functional-224594
functional_test.go:1098: (dbg) Run:  docker rmi minikube-local-cache-test:functional-224594
--- PASS: TestFunctional/serial/CacheCmd/cache/add_local (1.33s)

TestFunctional/serial/CacheCmd/cache/CacheDelete (0.07s)

=== RUN   TestFunctional/serial/CacheCmd/cache/CacheDelete
functional_test.go:1117: (dbg) Run:  out/minikube-linux-arm64 cache delete registry.k8s.io/pause:3.3
--- PASS: TestFunctional/serial/CacheCmd/cache/CacheDelete (0.07s)

TestFunctional/serial/CacheCmd/cache/list (0.06s)

=== RUN   TestFunctional/serial/CacheCmd/cache/list
functional_test.go:1125: (dbg) Run:  out/minikube-linux-arm64 cache list
--- PASS: TestFunctional/serial/CacheCmd/cache/list (0.06s)

TestFunctional/serial/CacheCmd/cache/verify_cache_inside_node (0.31s)

=== RUN   TestFunctional/serial/CacheCmd/cache/verify_cache_inside_node
functional_test.go:1139: (dbg) Run:  out/minikube-linux-arm64 -p functional-224594 ssh sudo crictl images
--- PASS: TestFunctional/serial/CacheCmd/cache/verify_cache_inside_node (0.31s)

TestFunctional/serial/CacheCmd/cache/cache_reload (1.89s)

=== RUN   TestFunctional/serial/CacheCmd/cache/cache_reload
functional_test.go:1162: (dbg) Run:  out/minikube-linux-arm64 -p functional-224594 ssh sudo crictl rmi registry.k8s.io/pause:latest
functional_test.go:1168: (dbg) Run:  out/minikube-linux-arm64 -p functional-224594 ssh sudo crictl inspecti registry.k8s.io/pause:latest
functional_test.go:1168: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-224594 ssh sudo crictl inspecti registry.k8s.io/pause:latest: exit status 1 (316.150073ms)

-- stdout --
	FATA[0000] no such image "registry.k8s.io/pause:latest" present 

-- /stdout --
** stderr ** 
	ssh: Process exited with status 1

** /stderr **
functional_test.go:1173: (dbg) Run:  out/minikube-linux-arm64 -p functional-224594 cache reload
functional_test.go:1178: (dbg) Run:  out/minikube-linux-arm64 -p functional-224594 ssh sudo crictl inspecti registry.k8s.io/pause:latest
--- PASS: TestFunctional/serial/CacheCmd/cache/cache_reload (1.89s)

TestFunctional/serial/CacheCmd/cache/delete (0.13s)

=== RUN   TestFunctional/serial/CacheCmd/cache/delete
functional_test.go:1187: (dbg) Run:  out/minikube-linux-arm64 cache delete registry.k8s.io/pause:3.1
functional_test.go:1187: (dbg) Run:  out/minikube-linux-arm64 cache delete registry.k8s.io/pause:latest
--- PASS: TestFunctional/serial/CacheCmd/cache/delete (0.13s)

TestFunctional/serial/MinikubeKubectlCmd (0.14s)

=== RUN   TestFunctional/serial/MinikubeKubectlCmd
functional_test.go:731: (dbg) Run:  out/minikube-linux-arm64 -p functional-224594 kubectl -- --context functional-224594 get pods
--- PASS: TestFunctional/serial/MinikubeKubectlCmd (0.14s)

TestFunctional/serial/MinikubeKubectlCmdDirectly (0.15s)

=== RUN   TestFunctional/serial/MinikubeKubectlCmdDirectly
functional_test.go:756: (dbg) Run:  out/kubectl --context functional-224594 get pods
--- PASS: TestFunctional/serial/MinikubeKubectlCmdDirectly (0.15s)

TestFunctional/serial/ExtraConfig (53.93s)

=== RUN   TestFunctional/serial/ExtraConfig
functional_test.go:772: (dbg) Run:  out/minikube-linux-arm64 start -p functional-224594 --extra-config=apiserver.enable-admission-plugins=NamespaceAutoProvision --wait=all
E1202 18:57:08.006829    4435 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/addons-932514/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
functional_test.go:772: (dbg) Done: out/minikube-linux-arm64 start -p functional-224594 --extra-config=apiserver.enable-admission-plugins=NamespaceAutoProvision --wait=all: (53.924569628s)
functional_test.go:776: restart took 53.924688963s for "functional-224594" cluster.
I1202 18:57:50.723903    4435 config.go:182] Loaded profile config "functional-224594": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.34.2
--- PASS: TestFunctional/serial/ExtraConfig (53.93s)

TestFunctional/serial/ComponentHealth (0.1s)

=== RUN   TestFunctional/serial/ComponentHealth
functional_test.go:825: (dbg) Run:  kubectl --context functional-224594 get po -l tier=control-plane -n kube-system -o=json
functional_test.go:840: etcd phase: Running
functional_test.go:850: etcd status: Ready
functional_test.go:840: kube-apiserver phase: Running
functional_test.go:850: kube-apiserver status: Ready
functional_test.go:840: kube-controller-manager phase: Running
functional_test.go:850: kube-controller-manager status: Ready
functional_test.go:840: kube-scheduler phase: Running
functional_test.go:850: kube-scheduler status: Ready
--- PASS: TestFunctional/serial/ComponentHealth (0.10s)

TestFunctional/serial/LogsCmd (1.49s)

=== RUN   TestFunctional/serial/LogsCmd
functional_test.go:1251: (dbg) Run:  out/minikube-linux-arm64 -p functional-224594 logs
functional_test.go:1251: (dbg) Done: out/minikube-linux-arm64 -p functional-224594 logs: (1.486602087s)
--- PASS: TestFunctional/serial/LogsCmd (1.49s)

TestFunctional/serial/LogsFileCmd (1.48s)

=== RUN   TestFunctional/serial/LogsFileCmd
functional_test.go:1265: (dbg) Run:  out/minikube-linux-arm64 -p functional-224594 logs --file /tmp/TestFunctionalserialLogsFileCmd4172582059/001/logs.txt
functional_test.go:1265: (dbg) Done: out/minikube-linux-arm64 -p functional-224594 logs --file /tmp/TestFunctionalserialLogsFileCmd4172582059/001/logs.txt: (1.479588964s)
--- PASS: TestFunctional/serial/LogsFileCmd (1.48s)

TestFunctional/serial/InvalidService (4.93s)

=== RUN   TestFunctional/serial/InvalidService
functional_test.go:2326: (dbg) Run:  kubectl --context functional-224594 apply -f testdata/invalidsvc.yaml
functional_test.go:2340: (dbg) Run:  out/minikube-linux-arm64 service invalid-svc -p functional-224594
functional_test.go:2340: (dbg) Non-zero exit: out/minikube-linux-arm64 service invalid-svc -p functional-224594: exit status 115 (978.768971ms)

-- stdout --
	┌───────────┬─────────────┬─────────────┬───────────────────────────┐
	│ NAMESPACE │    NAME     │ TARGET PORT │            URL            │
	├───────────┼─────────────┼─────────────┼───────────────────────────┤
	│ default   │ invalid-svc │ 80          │ http://192.168.49.2:31729 │
	└───────────┴─────────────┴─────────────┴───────────────────────────┘
	
	

-- /stdout --
** stderr ** 
	X Exiting due to SVC_UNREACHABLE: service not available: no running pod for service invalid-svc found
	* 
	╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│    * Please also attach the following file to the GitHub issue:                             │
	│    * - /tmp/minikube_service_96b204199e3191fa1740d4430b018a3c8028d52d_0.log                 │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯

** /stderr **
functional_test.go:2332: (dbg) Run:  kubectl --context functional-224594 delete -f testdata/invalidsvc.yaml
--- PASS: TestFunctional/serial/InvalidService (4.93s)
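The manifest exercised here, testdata/invalidsvc.yaml, is not reproduced in this log. A minimal sketch of a service that would trigger the same SVC_UNREACHABLE exit (status 115) is a NodePort Service whose selector matches no running pod; everything below except the service name `invalid-svc` (confirmed by the log) is an illustrative assumption, not the actual testdata contents.

```yaml
# Hypothetical stand-in for testdata/invalidsvc.yaml:
# a Service whose selector matches no pod, so `minikube service`
# finds no running endpoint and exits with SVC_UNREACHABLE (115).
apiVersion: v1
kind: Service
metadata:
  name: invalid-svc
  namespace: default
spec:
  type: NodePort
  selector:
    app: no-such-pod   # assumption: no pod carries this label
  ports:
    - port: 80
      targetPort: 80
```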

TestFunctional/parallel/ConfigCmd (0.45s)

=== RUN   TestFunctional/parallel/ConfigCmd
=== PAUSE TestFunctional/parallel/ConfigCmd

=== CONT  TestFunctional/parallel/ConfigCmd
functional_test.go:1214: (dbg) Run:  out/minikube-linux-arm64 -p functional-224594 config unset cpus
functional_test.go:1214: (dbg) Run:  out/minikube-linux-arm64 -p functional-224594 config get cpus
functional_test.go:1214: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-224594 config get cpus: exit status 14 (66.163403ms)

** stderr ** 
	Error: specified key could not be found in config

** /stderr **
functional_test.go:1214: (dbg) Run:  out/minikube-linux-arm64 -p functional-224594 config set cpus 2
functional_test.go:1214: (dbg) Run:  out/minikube-linux-arm64 -p functional-224594 config get cpus
functional_test.go:1214: (dbg) Run:  out/minikube-linux-arm64 -p functional-224594 config unset cpus
functional_test.go:1214: (dbg) Run:  out/minikube-linux-arm64 -p functional-224594 config get cpus
functional_test.go:1214: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-224594 config get cpus: exit status 14 (74.746465ms)

** stderr ** 
	Error: specified key could not be found in config

** /stderr **
--- PASS: TestFunctional/parallel/ConfigCmd (0.45s)

TestFunctional/parallel/DashboardCmd (6.26s)

=== RUN   TestFunctional/parallel/DashboardCmd
=== PAUSE TestFunctional/parallel/DashboardCmd

=== CONT  TestFunctional/parallel/DashboardCmd
functional_test.go:920: (dbg) daemon: [out/minikube-linux-arm64 dashboard --url --port 36195 -p functional-224594 --alsologtostderr -v=1]
functional_test.go:925: (dbg) stopping [out/minikube-linux-arm64 dashboard --url --port 36195 -p functional-224594 --alsologtostderr -v=1] ...
helpers_test.go:525: unable to kill pid 39204: os: process already finished
--- PASS: TestFunctional/parallel/DashboardCmd (6.26s)

TestFunctional/parallel/DryRun (0.56s)

=== RUN   TestFunctional/parallel/DryRun
=== PAUSE TestFunctional/parallel/DryRun

=== CONT  TestFunctional/parallel/DryRun
functional_test.go:989: (dbg) Run:  out/minikube-linux-arm64 start -p functional-224594 --dry-run --memory 250MB --alsologtostderr --driver=docker  --container-runtime=containerd
E1202 18:58:29.928716    4435 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/addons-932514/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
functional_test.go:989: (dbg) Non-zero exit: out/minikube-linux-arm64 start -p functional-224594 --dry-run --memory 250MB --alsologtostderr --driver=docker  --container-runtime=containerd: exit status 23 (277.273429ms)

-- stdout --
	* [functional-224594] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	  - MINIKUBE_LOCATION=22021
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - KUBECONFIG=/home/jenkins/minikube-integration/22021-2487/kubeconfig
	  - MINIKUBE_HOME=/home/jenkins/minikube-integration/22021-2487/.minikube
	  - MINIKUBE_BIN=out/minikube-linux-arm64
	  - MINIKUBE_FORCE_SYSTEMD=
	* Using the docker driver based on existing profile
	
	

-- /stdout --
** stderr ** 
	I1202 18:58:29.966412   38925 out.go:360] Setting OutFile to fd 1 ...
	I1202 18:58:29.966546   38925 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1202 18:58:29.966581   38925 out.go:374] Setting ErrFile to fd 2...
	I1202 18:58:29.966594   38925 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1202 18:58:29.966843   38925 root.go:338] Updating PATH: /home/jenkins/minikube-integration/22021-2487/.minikube/bin
	I1202 18:58:29.967213   38925 out.go:368] Setting JSON to false
	I1202 18:58:29.968130   38925 start.go:133] hostinfo: {"hostname":"ip-172-31-31-251","uptime":2446,"bootTime":1764699464,"procs":204,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1084-aws","kernelArch":"aarch64","virtualizationSystem":"","virtualizationRole":"","hostId":"982e3628-3742-4b3e-bb63-ac1b07660ec7"}
	I1202 18:58:29.968202   38925 start.go:143] virtualization:  
	I1202 18:58:29.973371   38925 out.go:179] * [functional-224594] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	I1202 18:58:29.987889   38925 out.go:179]   - MINIKUBE_LOCATION=22021
	I1202 18:58:29.987939   38925 notify.go:221] Checking for updates...
	I1202 18:58:29.997821   38925 out.go:179]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I1202 18:58:30.017927   38925 out.go:179]   - KUBECONFIG=/home/jenkins/minikube-integration/22021-2487/kubeconfig
	I1202 18:58:30.046960   38925 out.go:179]   - MINIKUBE_HOME=/home/jenkins/minikube-integration/22021-2487/.minikube
	I1202 18:58:30.049978   38925 out.go:179]   - MINIKUBE_BIN=out/minikube-linux-arm64
	I1202 18:58:30.054195   38925 out.go:179]   - MINIKUBE_FORCE_SYSTEMD=
	I1202 18:58:30.057935   38925 config.go:182] Loaded profile config "functional-224594": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.34.2
	I1202 18:58:30.060173   38925 driver.go:422] Setting default libvirt URI to qemu:///system
	I1202 18:58:30.108732   38925 docker.go:124] docker version: linux-28.1.1:Docker Engine - Community
	I1202 18:58:30.108881   38925 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1202 18:58:30.173011   38925 info.go:266] docker info: {ID:EOU5:DNGX:XN6V:L2FZ:UXRM:5TWK:EVUR:KC2F:GT7Z:Y4O4:GB77:5PD3 Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:2 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:53 SystemTime:2025-12-02 18:58:30.162305903 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214839296 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-31-251 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1202 18:58:30.173123   38925 docker.go:319] overlay module found
	I1202 18:58:30.176284   38925 out.go:179] * Using the docker driver based on existing profile
	I1202 18:58:30.179296   38925 start.go:309] selected driver: docker
	I1202 18:58:30.179322   38925 start.go:927] validating driver "docker" against &{Name:functional-224594 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.34.2 ClusterName:functional-224594 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.34.2 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[default-storageclass:true storage-provisioner:true] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1202 18:58:30.179441   38925 start.go:938] status for docker: {Installed:true Healthy:true Running:false NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I1202 18:58:30.182990   38925 out.go:203] 
	W1202 18:58:30.186007   38925 out.go:285] X Exiting due to RSRC_INSUFFICIENT_REQ_MEMORY: Requested memory allocation 250MiB is less than the usable minimum of 1800MB
	X Exiting due to RSRC_INSUFFICIENT_REQ_MEMORY: Requested memory allocation 250MiB is less than the usable minimum of 1800MB
	I1202 18:58:30.188914   38925 out.go:203] 

** /stderr **
functional_test.go:1006: (dbg) Run:  out/minikube-linux-arm64 start -p functional-224594 --dry-run --alsologtostderr -v=1 --driver=docker  --container-runtime=containerd
--- PASS: TestFunctional/parallel/DryRun (0.56s)

TestFunctional/parallel/InternationalLanguage (0.23s)

=== RUN   TestFunctional/parallel/InternationalLanguage
=== PAUSE TestFunctional/parallel/InternationalLanguage

=== CONT  TestFunctional/parallel/InternationalLanguage
functional_test.go:1035: (dbg) Run:  out/minikube-linux-arm64 start -p functional-224594 --dry-run --memory 250MB --alsologtostderr --driver=docker  --container-runtime=containerd
functional_test.go:1035: (dbg) Non-zero exit: out/minikube-linux-arm64 start -p functional-224594 --dry-run --memory 250MB --alsologtostderr --driver=docker  --container-runtime=containerd: exit status 23 (228.362442ms)

-- stdout --
	* [functional-224594] minikube v1.37.0 sur Ubuntu 20.04 (arm64)
	  - MINIKUBE_LOCATION=22021
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - KUBECONFIG=/home/jenkins/minikube-integration/22021-2487/kubeconfig
	  - MINIKUBE_HOME=/home/jenkins/minikube-integration/22021-2487/.minikube
	  - MINIKUBE_BIN=out/minikube-linux-arm64
	  - MINIKUBE_FORCE_SYSTEMD=
	* Utilisation du pilote docker basé sur le profil existant
	
	

-- /stdout --
** stderr ** 
	I1202 18:58:29.764672   38870 out.go:360] Setting OutFile to fd 1 ...
	I1202 18:58:29.764800   38870 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1202 18:58:29.764812   38870 out.go:374] Setting ErrFile to fd 2...
	I1202 18:58:29.764817   38870 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1202 18:58:29.765495   38870 root.go:338] Updating PATH: /home/jenkins/minikube-integration/22021-2487/.minikube/bin
	I1202 18:58:29.765904   38870 out.go:368] Setting JSON to false
	I1202 18:58:29.766804   38870 start.go:133] hostinfo: {"hostname":"ip-172-31-31-251","uptime":2446,"bootTime":1764699464,"procs":204,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1084-aws","kernelArch":"aarch64","virtualizationSystem":"","virtualizationRole":"","hostId":"982e3628-3742-4b3e-bb63-ac1b07660ec7"}
	I1202 18:58:29.766876   38870 start.go:143] virtualization:  
	I1202 18:58:29.771098   38870 out.go:179] * [functional-224594] minikube v1.37.0 sur Ubuntu 20.04 (arm64)
	I1202 18:58:29.774989   38870 out.go:179]   - MINIKUBE_LOCATION=22021
	I1202 18:58:29.775112   38870 notify.go:221] Checking for updates...
	I1202 18:58:29.787640   38870 out.go:179]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I1202 18:58:29.791941   38870 out.go:179]   - KUBECONFIG=/home/jenkins/minikube-integration/22021-2487/kubeconfig
	I1202 18:58:29.794878   38870 out.go:179]   - MINIKUBE_HOME=/home/jenkins/minikube-integration/22021-2487/.minikube
	I1202 18:58:29.797921   38870 out.go:179]   - MINIKUBE_BIN=out/minikube-linux-arm64
	I1202 18:58:29.800808   38870 out.go:179]   - MINIKUBE_FORCE_SYSTEMD=
	I1202 18:58:29.804262   38870 config.go:182] Loaded profile config "functional-224594": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.34.2
	I1202 18:58:29.805022   38870 driver.go:422] Setting default libvirt URI to qemu:///system
	I1202 18:58:29.830083   38870 docker.go:124] docker version: linux-28.1.1:Docker Engine - Community
	I1202 18:58:29.830196   38870 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1202 18:58:29.894971   38870 info.go:266] docker info: {ID:EOU5:DNGX:XN6V:L2FZ:UXRM:5TWK:EVUR:KC2F:GT7Z:Y4O4:GB77:5PD3 Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:2 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:53 SystemTime:2025-12-02 18:58:29.884205453 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214839296 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-31-251 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1202 18:58:29.895085   38870 docker.go:319] overlay module found
	I1202 18:58:29.898356   38870 out.go:179] * Utilisation du pilote docker basé sur le profil existant
	I1202 18:58:29.901231   38870 start.go:309] selected driver: docker
	I1202 18:58:29.901250   38870 start.go:927] validating driver "docker" against &{Name:functional-224594 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.34.2 ClusterName:functional-224594 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.34.2 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[default-storageclass:true storage-provisioner:true] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1202 18:58:29.901359   38870 start.go:938] status for docker: {Installed:true Healthy:true Running:false NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I1202 18:58:29.904967   38870 out.go:203] 
	W1202 18:58:29.907801   38870 out.go:285] X Fermeture en raison de RSRC_INSUFFICIENT_REQ_MEMORY : L'allocation de mémoire demandée 250 Mio est inférieure au minimum utilisable de 1800 Mo
	X Fermeture en raison de RSRC_INSUFFICIENT_REQ_MEMORY : L'allocation de mémoire demandée 250 Mio est inférieure au minimum utilisable de 1800 Mo
	I1202 18:58:29.910619   38870 out.go:203] 

** /stderr **
--- PASS: TestFunctional/parallel/InternationalLanguage (0.23s)

TestFunctional/parallel/StatusCmd (1.34s)

=== RUN   TestFunctional/parallel/StatusCmd
=== PAUSE TestFunctional/parallel/StatusCmd

=== CONT  TestFunctional/parallel/StatusCmd
functional_test.go:869: (dbg) Run:  out/minikube-linux-arm64 -p functional-224594 status
functional_test.go:875: (dbg) Run:  out/minikube-linux-arm64 -p functional-224594 status -f host:{{.Host}},kublet:{{.Kubelet}},apiserver:{{.APIServer}},kubeconfig:{{.Kubeconfig}}
functional_test.go:887: (dbg) Run:  out/minikube-linux-arm64 -p functional-224594 status -o json
--- PASS: TestFunctional/parallel/StatusCmd (1.34s)

TestFunctional/parallel/ServiceCmdConnect (8.76s)

=== RUN   TestFunctional/parallel/ServiceCmdConnect
=== PAUSE TestFunctional/parallel/ServiceCmdConnect

=== CONT  TestFunctional/parallel/ServiceCmdConnect
functional_test.go:1636: (dbg) Run:  kubectl --context functional-224594 create deployment hello-node-connect --image kicbase/echo-server
functional_test.go:1640: (dbg) Run:  kubectl --context functional-224594 expose deployment hello-node-connect --type=NodePort --port=8080
functional_test.go:1645: (dbg) TestFunctional/parallel/ServiceCmdConnect: waiting 10m0s for pods matching "app=hello-node-connect" in namespace "default" ...
helpers_test.go:352: "hello-node-connect-7d85dfc575-llfzs" [3b88528d-c8a3-4595-820c-0cd4362cc06f] Pending / Ready:ContainersNotReady (containers with unready status: [echo-server]) / ContainersReady:ContainersNotReady (containers with unready status: [echo-server])
helpers_test.go:352: "hello-node-connect-7d85dfc575-llfzs" [3b88528d-c8a3-4595-820c-0cd4362cc06f] Running
functional_test.go:1645: (dbg) TestFunctional/parallel/ServiceCmdConnect: app=hello-node-connect healthy within 8.003302659s
functional_test.go:1654: (dbg) Run:  out/minikube-linux-arm64 -p functional-224594 service hello-node-connect --url
functional_test.go:1660: found endpoint for hello-node-connect: http://192.168.49.2:30156
functional_test.go:1680: http://192.168.49.2:30156: success! body:
Request served by hello-node-connect-7d85dfc575-llfzs

HTTP/1.1 GET /

Host: 192.168.49.2:30156
Accept-Encoding: gzip
User-Agent: Go-http-client/1.1
--- PASS: TestFunctional/parallel/ServiceCmdConnect (8.76s)

TestFunctional/parallel/AddonsCmd (0.17s)

=== RUN   TestFunctional/parallel/AddonsCmd
=== PAUSE TestFunctional/parallel/AddonsCmd

=== CONT  TestFunctional/parallel/AddonsCmd
functional_test.go:1695: (dbg) Run:  out/minikube-linux-arm64 -p functional-224594 addons list
functional_test.go:1707: (dbg) Run:  out/minikube-linux-arm64 -p functional-224594 addons list -o json
--- PASS: TestFunctional/parallel/AddonsCmd (0.17s)

TestFunctional/parallel/PersistentVolumeClaim (23.88s)

=== RUN   TestFunctional/parallel/PersistentVolumeClaim
=== PAUSE TestFunctional/parallel/PersistentVolumeClaim

=== CONT  TestFunctional/parallel/PersistentVolumeClaim
functional_test_pvc_test.go:50: (dbg) TestFunctional/parallel/PersistentVolumeClaim: waiting 4m0s for pods matching "integration-test=storage-provisioner" in namespace "kube-system" ...
helpers_test.go:352: "storage-provisioner" [718b6859-0fd4-47fb-99d5-e103ef2879fd] Running
functional_test_pvc_test.go:50: (dbg) TestFunctional/parallel/PersistentVolumeClaim: integration-test=storage-provisioner healthy within 6.00307304s
functional_test_pvc_test.go:55: (dbg) Run:  kubectl --context functional-224594 get storageclass -o=json
functional_test_pvc_test.go:75: (dbg) Run:  kubectl --context functional-224594 apply -f testdata/storage-provisioner/pvc.yaml
functional_test_pvc_test.go:82: (dbg) Run:  kubectl --context functional-224594 get pvc myclaim -o=json
functional_test_pvc_test.go:131: (dbg) Run:  kubectl --context functional-224594 apply -f testdata/storage-provisioner/pod.yaml
functional_test_pvc_test.go:140: (dbg) TestFunctional/parallel/PersistentVolumeClaim: waiting 4m0s for pods matching "test=storage-provisioner" in namespace "default" ...
helpers_test.go:352: "sp-pod" [083a052b-1992-4a50-94cc-cffb681c50a8] Pending
helpers_test.go:352: "sp-pod" [083a052b-1992-4a50-94cc-cffb681c50a8] Pending / Ready:ContainersNotReady (containers with unready status: [myfrontend]) / ContainersReady:ContainersNotReady (containers with unready status: [myfrontend])
helpers_test.go:352: "sp-pod" [083a052b-1992-4a50-94cc-cffb681c50a8] Running
functional_test_pvc_test.go:140: (dbg) TestFunctional/parallel/PersistentVolumeClaim: test=storage-provisioner healthy within 9.003736243s
functional_test_pvc_test.go:106: (dbg) Run:  kubectl --context functional-224594 exec sp-pod -- touch /tmp/mount/foo
functional_test_pvc_test.go:112: (dbg) Run:  kubectl --context functional-224594 delete -f testdata/storage-provisioner/pod.yaml
functional_test_pvc_test.go:131: (dbg) Run:  kubectl --context functional-224594 apply -f testdata/storage-provisioner/pod.yaml
functional_test_pvc_test.go:140: (dbg) TestFunctional/parallel/PersistentVolumeClaim: waiting 4m0s for pods matching "test=storage-provisioner" in namespace "default" ...
helpers_test.go:352: "sp-pod" [fc28e70f-b2ee-4311-a97b-3d0e9fac7203] Pending / Ready:ContainersNotReady (containers with unready status: [myfrontend]) / ContainersReady:ContainersNotReady (containers with unready status: [myfrontend])
helpers_test.go:352: "sp-pod" [fc28e70f-b2ee-4311-a97b-3d0e9fac7203] Running
functional_test_pvc_test.go:140: (dbg) TestFunctional/parallel/PersistentVolumeClaim: test=storage-provisioner healthy within 7.003633803s
functional_test_pvc_test.go:120: (dbg) Run:  kubectl --context functional-224594 exec sp-pod -- ls /tmp/mount
--- PASS: TestFunctional/parallel/PersistentVolumeClaim (23.88s)
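The sequence above is the standard persistence check: the claim binds, a pod writes /tmp/mount/foo, the pod is deleted and recreated, and the file is still present. The claim itself (testdata/storage-provisioner/pvc.yaml) is not shown in the log; a minimal sketch consistent with the `kubectl get pvc myclaim` calls above would be the following, where only the name `myclaim` is confirmed by the log and the access mode and size are assumptions.

```yaml
# Hypothetical approximation of testdata/storage-provisioner/pvc.yaml;
# only the claim name "myclaim" is confirmed by the log above.
apiVersion: v1
kind: PersistentVolumeClaim
metadata:
  name: myclaim
spec:
  accessModes:
    - ReadWriteOnce        # assumption
  resources:
    requests:
      storage: 500Mi       # assumption
```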

TestFunctional/parallel/SSHCmd (0.73s)

=== RUN   TestFunctional/parallel/SSHCmd
=== PAUSE TestFunctional/parallel/SSHCmd

=== CONT  TestFunctional/parallel/SSHCmd
functional_test.go:1730: (dbg) Run:  out/minikube-linux-arm64 -p functional-224594 ssh "echo hello"
functional_test.go:1747: (dbg) Run:  out/minikube-linux-arm64 -p functional-224594 ssh "cat /etc/hostname"
--- PASS: TestFunctional/parallel/SSHCmd (0.73s)

TestFunctional/parallel/CpCmd (2.56s)

=== RUN   TestFunctional/parallel/CpCmd
=== PAUSE TestFunctional/parallel/CpCmd
=== CONT  TestFunctional/parallel/CpCmd
helpers_test.go:573: (dbg) Run:  out/minikube-linux-arm64 -p functional-224594 cp testdata/cp-test.txt /home/docker/cp-test.txt
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p functional-224594 ssh -n functional-224594 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:573: (dbg) Run:  out/minikube-linux-arm64 -p functional-224594 cp functional-224594:/home/docker/cp-test.txt /tmp/TestFunctionalparallelCpCmd3430410996/001/cp-test.txt
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p functional-224594 ssh -n functional-224594 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:573: (dbg) Run:  out/minikube-linux-arm64 -p functional-224594 cp testdata/cp-test.txt /tmp/does/not/exist/cp-test.txt
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p functional-224594 ssh -n functional-224594 "sudo cat /tmp/does/not/exist/cp-test.txt"
--- PASS: TestFunctional/parallel/CpCmd (2.56s)

TestFunctional/parallel/FileSync (0.34s)

=== RUN   TestFunctional/parallel/FileSync
=== PAUSE TestFunctional/parallel/FileSync
=== CONT  TestFunctional/parallel/FileSync
functional_test.go:1934: Checking for existence of /etc/test/nested/copy/4435/hosts within VM
functional_test.go:1936: (dbg) Run:  out/minikube-linux-arm64 -p functional-224594 ssh "sudo cat /etc/test/nested/copy/4435/hosts"
functional_test.go:1941: file sync test content: Test file for checking file sync process
--- PASS: TestFunctional/parallel/FileSync (0.34s)

TestFunctional/parallel/CertSync (2.27s)

=== RUN   TestFunctional/parallel/CertSync
=== PAUSE TestFunctional/parallel/CertSync
=== CONT  TestFunctional/parallel/CertSync
functional_test.go:1977: Checking for existence of /etc/ssl/certs/4435.pem within VM
functional_test.go:1978: (dbg) Run:  out/minikube-linux-arm64 -p functional-224594 ssh "sudo cat /etc/ssl/certs/4435.pem"
functional_test.go:1977: Checking for existence of /usr/share/ca-certificates/4435.pem within VM
functional_test.go:1978: (dbg) Run:  out/minikube-linux-arm64 -p functional-224594 ssh "sudo cat /usr/share/ca-certificates/4435.pem"
functional_test.go:1977: Checking for existence of /etc/ssl/certs/51391683.0 within VM
functional_test.go:1978: (dbg) Run:  out/minikube-linux-arm64 -p functional-224594 ssh "sudo cat /etc/ssl/certs/51391683.0"
functional_test.go:2004: Checking for existence of /etc/ssl/certs/44352.pem within VM
functional_test.go:2005: (dbg) Run:  out/minikube-linux-arm64 -p functional-224594 ssh "sudo cat /etc/ssl/certs/44352.pem"
functional_test.go:2004: Checking for existence of /usr/share/ca-certificates/44352.pem within VM
functional_test.go:2005: (dbg) Run:  out/minikube-linux-arm64 -p functional-224594 ssh "sudo cat /usr/share/ca-certificates/44352.pem"
functional_test.go:2004: Checking for existence of /etc/ssl/certs/3ec20f2e.0 within VM
functional_test.go:2005: (dbg) Run:  out/minikube-linux-arm64 -p functional-224594 ssh "sudo cat /etc/ssl/certs/3ec20f2e.0"
--- PASS: TestFunctional/parallel/CertSync (2.27s)

TestFunctional/parallel/NodeLabels (0.12s)

=== RUN   TestFunctional/parallel/NodeLabels
=== PAUSE TestFunctional/parallel/NodeLabels
=== CONT  TestFunctional/parallel/NodeLabels
functional_test.go:234: (dbg) Run:  kubectl --context functional-224594 get nodes --output=go-template "--template='{{range $k, $v := (index .items 0).metadata.labels}}{{$k}} {{end}}'"
--- PASS: TestFunctional/parallel/NodeLabels (0.12s)

TestFunctional/parallel/NonActiveRuntimeDisabled (0.75s)

=== RUN   TestFunctional/parallel/NonActiveRuntimeDisabled
=== PAUSE TestFunctional/parallel/NonActiveRuntimeDisabled
=== CONT  TestFunctional/parallel/NonActiveRuntimeDisabled
functional_test.go:2032: (dbg) Run:  out/minikube-linux-arm64 -p functional-224594 ssh "sudo systemctl is-active docker"
functional_test.go:2032: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-224594 ssh "sudo systemctl is-active docker": exit status 1 (341.992968ms)

-- stdout --
	inactive

-- /stdout --
** stderr ** 
	ssh: Process exited with status 3

** /stderr **
functional_test.go:2032: (dbg) Run:  out/minikube-linux-arm64 -p functional-224594 ssh "sudo systemctl is-active crio"
functional_test.go:2032: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-224594 ssh "sudo systemctl is-active crio": exit status 1 (408.570222ms)

-- stdout --
	inactive

-- /stdout --
** stderr ** 
	ssh: Process exited with status 3

** /stderr **
--- PASS: TestFunctional/parallel/NonActiveRuntimeDisabled (0.75s)

TestFunctional/parallel/License (0.42s)

=== RUN   TestFunctional/parallel/License
=== PAUSE TestFunctional/parallel/License
=== CONT  TestFunctional/parallel/License
functional_test.go:2293: (dbg) Run:  out/minikube-linux-arm64 license
--- PASS: TestFunctional/parallel/License (0.42s)

TestFunctional/parallel/TunnelCmd/serial/RunSecondTunnel (0.8s)

=== RUN   TestFunctional/parallel/TunnelCmd/serial/RunSecondTunnel
functional_test_tunnel_test.go:154: (dbg) daemon: [out/minikube-linux-arm64 -p functional-224594 tunnel --alsologtostderr]
functional_test_tunnel_test.go:154: (dbg) daemon: [out/minikube-linux-arm64 -p functional-224594 tunnel --alsologtostderr]
functional_test_tunnel_test.go:194: (dbg) stopping [out/minikube-linux-arm64 -p functional-224594 tunnel --alsologtostderr] ...
helpers_test.go:507: unable to find parent, assuming dead: process does not exist
functional_test_tunnel_test.go:194: (dbg) stopping [out/minikube-linux-arm64 -p functional-224594 tunnel --alsologtostderr] ...
helpers_test.go:525: unable to kill pid 36240: os: process already finished
--- PASS: TestFunctional/parallel/TunnelCmd/serial/RunSecondTunnel (0.80s)

TestFunctional/parallel/TunnelCmd/serial/StartTunnel (0s)

=== RUN   TestFunctional/parallel/TunnelCmd/serial/StartTunnel
functional_test_tunnel_test.go:129: (dbg) daemon: [out/minikube-linux-arm64 -p functional-224594 tunnel --alsologtostderr]
--- PASS: TestFunctional/parallel/TunnelCmd/serial/StartTunnel (0.00s)

TestFunctional/parallel/TunnelCmd/serial/WaitService/Setup (8.47s)

=== RUN   TestFunctional/parallel/TunnelCmd/serial/WaitService/Setup
functional_test_tunnel_test.go:212: (dbg) Run:  kubectl --context functional-224594 apply -f testdata/testsvc.yaml
functional_test_tunnel_test.go:216: (dbg) TestFunctional/parallel/TunnelCmd/serial/WaitService/Setup: waiting 4m0s for pods matching "run=nginx-svc" in namespace "default" ...
helpers_test.go:352: "nginx-svc" [a164ef69-632b-40ec-abb5-3c02751c3850] Pending / Ready:ContainersNotReady (containers with unready status: [nginx]) / ContainersReady:ContainersNotReady (containers with unready status: [nginx])
helpers_test.go:352: "nginx-svc" [a164ef69-632b-40ec-abb5-3c02751c3850] Running
functional_test_tunnel_test.go:216: (dbg) TestFunctional/parallel/TunnelCmd/serial/WaitService/Setup: run=nginx-svc healthy within 8.003452995s
I1202 18:58:09.170684    4435 kapi.go:150] Service nginx-svc in namespace default found.
--- PASS: TestFunctional/parallel/TunnelCmd/serial/WaitService/Setup (8.47s)

TestFunctional/parallel/TunnelCmd/serial/WaitService/IngressIP (0.12s)

=== RUN   TestFunctional/parallel/TunnelCmd/serial/WaitService/IngressIP
functional_test_tunnel_test.go:234: (dbg) Run:  kubectl --context functional-224594 get svc nginx-svc -o jsonpath={.status.loadBalancer.ingress[0].ip}
--- PASS: TestFunctional/parallel/TunnelCmd/serial/WaitService/IngressIP (0.12s)

TestFunctional/parallel/TunnelCmd/serial/AccessDirect (0s)

=== RUN   TestFunctional/parallel/TunnelCmd/serial/AccessDirect
functional_test_tunnel_test.go:299: tunnel at http://10.101.95.205 is working!
--- PASS: TestFunctional/parallel/TunnelCmd/serial/AccessDirect (0.00s)

TestFunctional/parallel/TunnelCmd/serial/DeleteTunnel (0.11s)

=== RUN   TestFunctional/parallel/TunnelCmd/serial/DeleteTunnel
functional_test_tunnel_test.go:434: (dbg) stopping [out/minikube-linux-arm64 -p functional-224594 tunnel --alsologtostderr] ...
functional_test_tunnel_test.go:437: failed to stop process: signal: terminated
--- PASS: TestFunctional/parallel/TunnelCmd/serial/DeleteTunnel (0.11s)

TestFunctional/parallel/ServiceCmd/DeployApp (7.25s)

=== RUN   TestFunctional/parallel/ServiceCmd/DeployApp
functional_test.go:1451: (dbg) Run:  kubectl --context functional-224594 create deployment hello-node --image kicbase/echo-server
functional_test.go:1455: (dbg) Run:  kubectl --context functional-224594 expose deployment hello-node --type=NodePort --port=8080
functional_test.go:1460: (dbg) TestFunctional/parallel/ServiceCmd/DeployApp: waiting 10m0s for pods matching "app=hello-node" in namespace "default" ...
helpers_test.go:352: "hello-node-75c85bcc94-kb568" [d84740f2-f549-47c8-9d0e-71e00b9117b8] Pending / Ready:ContainersNotReady (containers with unready status: [echo-server]) / ContainersReady:ContainersNotReady (containers with unready status: [echo-server])
helpers_test.go:352: "hello-node-75c85bcc94-kb568" [d84740f2-f549-47c8-9d0e-71e00b9117b8] Running
functional_test.go:1460: (dbg) TestFunctional/parallel/ServiceCmd/DeployApp: app=hello-node healthy within 7.004728507s
--- PASS: TestFunctional/parallel/ServiceCmd/DeployApp (7.25s)

TestFunctional/parallel/ProfileCmd/profile_not_create (0.47s)

=== RUN   TestFunctional/parallel/ProfileCmd/profile_not_create
functional_test.go:1285: (dbg) Run:  out/minikube-linux-arm64 profile lis
functional_test.go:1290: (dbg) Run:  out/minikube-linux-arm64 profile list --output json
--- PASS: TestFunctional/parallel/ProfileCmd/profile_not_create (0.47s)

TestFunctional/parallel/ServiceCmd/List (0.63s)

=== RUN   TestFunctional/parallel/ServiceCmd/List
functional_test.go:1469: (dbg) Run:  out/minikube-linux-arm64 -p functional-224594 service list
--- PASS: TestFunctional/parallel/ServiceCmd/List (0.63s)

TestFunctional/parallel/ProfileCmd/profile_list (0.55s)

=== RUN   TestFunctional/parallel/ProfileCmd/profile_list
functional_test.go:1325: (dbg) Run:  out/minikube-linux-arm64 profile list
functional_test.go:1330: Took "498.970531ms" to run "out/minikube-linux-arm64 profile list"
functional_test.go:1339: (dbg) Run:  out/minikube-linux-arm64 profile list -l
functional_test.go:1344: Took "52.486694ms" to run "out/minikube-linux-arm64 profile list -l"
--- PASS: TestFunctional/parallel/ProfileCmd/profile_list (0.55s)

TestFunctional/parallel/ProfileCmd/profile_json_output (0.54s)

=== RUN   TestFunctional/parallel/ProfileCmd/profile_json_output
functional_test.go:1376: (dbg) Run:  out/minikube-linux-arm64 profile list -o json
functional_test.go:1381: Took "480.240911ms" to run "out/minikube-linux-arm64 profile list -o json"
functional_test.go:1389: (dbg) Run:  out/minikube-linux-arm64 profile list -o json --light
functional_test.go:1394: Took "55.426929ms" to run "out/minikube-linux-arm64 profile list -o json --light"
--- PASS: TestFunctional/parallel/ProfileCmd/profile_json_output (0.54s)

TestFunctional/parallel/ServiceCmd/JSONOutput (0.63s)

=== RUN   TestFunctional/parallel/ServiceCmd/JSONOutput
functional_test.go:1499: (dbg) Run:  out/minikube-linux-arm64 -p functional-224594 service list -o json
functional_test.go:1504: Took "629.61252ms" to run "out/minikube-linux-arm64 -p functional-224594 service list -o json"
--- PASS: TestFunctional/parallel/ServiceCmd/JSONOutput (0.63s)

TestFunctional/parallel/MountCmd/any-port (8.8s)

=== RUN   TestFunctional/parallel/MountCmd/any-port
functional_test_mount_test.go:73: (dbg) daemon: [out/minikube-linux-arm64 mount -p functional-224594 /tmp/TestFunctionalparallelMountCmdany-port1866203534/001:/mount-9p --alsologtostderr -v=1]
functional_test_mount_test.go:107: wrote "test-1764701906728990355" to /tmp/TestFunctionalparallelMountCmdany-port1866203534/001/created-by-test
functional_test_mount_test.go:107: wrote "test-1764701906728990355" to /tmp/TestFunctionalparallelMountCmdany-port1866203534/001/created-by-test-removed-by-pod
functional_test_mount_test.go:107: wrote "test-1764701906728990355" to /tmp/TestFunctionalparallelMountCmdany-port1866203534/001/test-1764701906728990355
functional_test_mount_test.go:115: (dbg) Run:  out/minikube-linux-arm64 -p functional-224594 ssh "findmnt -T /mount-9p | grep 9p"
functional_test_mount_test.go:115: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-224594 ssh "findmnt -T /mount-9p | grep 9p": exit status 1 (421.929082ms)

** stderr ** 
	ssh: Process exited with status 1

** /stderr **
I1202 18:58:27.152625    4435 retry.go:31] will retry after 632.526544ms: exit status 1
functional_test_mount_test.go:115: (dbg) Run:  out/minikube-linux-arm64 -p functional-224594 ssh "findmnt -T /mount-9p | grep 9p"
functional_test_mount_test.go:129: (dbg) Run:  out/minikube-linux-arm64 -p functional-224594 ssh -- ls -la /mount-9p
functional_test_mount_test.go:133: guest mount directory contents
total 2
-rw-r--r-- 1 docker docker 24 Dec  2 18:58 created-by-test
-rw-r--r-- 1 docker docker 24 Dec  2 18:58 created-by-test-removed-by-pod
-rw-r--r-- 1 docker docker 24 Dec  2 18:58 test-1764701906728990355
functional_test_mount_test.go:137: (dbg) Run:  out/minikube-linux-arm64 -p functional-224594 ssh cat /mount-9p/test-1764701906728990355
functional_test_mount_test.go:148: (dbg) Run:  kubectl --context functional-224594 replace --force -f testdata/busybox-mount-test.yaml
functional_test_mount_test.go:153: (dbg) TestFunctional/parallel/MountCmd/any-port: waiting 4m0s for pods matching "integration-test=busybox-mount" in namespace "default" ...
helpers_test.go:352: "busybox-mount" [e0bdab93-d617-4ef2-a222-1c2e1d9dfaf6] Pending
helpers_test.go:352: "busybox-mount" [e0bdab93-d617-4ef2-a222-1c2e1d9dfaf6] Pending / Ready:ContainersNotReady (containers with unready status: [mount-munger]) / ContainersReady:ContainersNotReady (containers with unready status: [mount-munger])
helpers_test.go:352: "busybox-mount" [e0bdab93-d617-4ef2-a222-1c2e1d9dfaf6] Pending / Initialized:PodCompleted / Ready:PodCompleted / ContainersReady:PodCompleted
helpers_test.go:352: "busybox-mount" [e0bdab93-d617-4ef2-a222-1c2e1d9dfaf6] Succeeded / Initialized:PodCompleted / Ready:PodCompleted / ContainersReady:PodCompleted
functional_test_mount_test.go:153: (dbg) TestFunctional/parallel/MountCmd/any-port: integration-test=busybox-mount healthy within 5.003618952s
functional_test_mount_test.go:169: (dbg) Run:  kubectl --context functional-224594 logs busybox-mount
functional_test_mount_test.go:181: (dbg) Run:  out/minikube-linux-arm64 -p functional-224594 ssh stat /mount-9p/created-by-test
functional_test_mount_test.go:181: (dbg) Run:  out/minikube-linux-arm64 -p functional-224594 ssh stat /mount-9p/created-by-pod
functional_test_mount_test.go:90: (dbg) Run:  out/minikube-linux-arm64 -p functional-224594 ssh "sudo umount -f /mount-9p"
functional_test_mount_test.go:94: (dbg) stopping [out/minikube-linux-arm64 mount -p functional-224594 /tmp/TestFunctionalparallelMountCmdany-port1866203534/001:/mount-9p --alsologtostderr -v=1] ...
--- PASS: TestFunctional/parallel/MountCmd/any-port (8.80s)

TestFunctional/parallel/ServiceCmd/HTTPS (0.59s)

=== RUN   TestFunctional/parallel/ServiceCmd/HTTPS
functional_test.go:1519: (dbg) Run:  out/minikube-linux-arm64 -p functional-224594 service --namespace=default --https --url hello-node
functional_test.go:1532: found endpoint: https://192.168.49.2:32257
--- PASS: TestFunctional/parallel/ServiceCmd/HTTPS (0.59s)

TestFunctional/parallel/ServiceCmd/Format (0.41s)

=== RUN   TestFunctional/parallel/ServiceCmd/Format
functional_test.go:1550: (dbg) Run:  out/minikube-linux-arm64 -p functional-224594 service hello-node --url --format={{.IP}}
--- PASS: TestFunctional/parallel/ServiceCmd/Format (0.41s)

TestFunctional/parallel/ServiceCmd/URL (0.51s)

=== RUN   TestFunctional/parallel/ServiceCmd/URL
functional_test.go:1569: (dbg) Run:  out/minikube-linux-arm64 -p functional-224594 service hello-node --url
functional_test.go:1575: found endpoint for hello-node: http://192.168.49.2:32257
--- PASS: TestFunctional/parallel/ServiceCmd/URL (0.51s)

TestFunctional/parallel/MountCmd/specific-port (2.33s)

=== RUN   TestFunctional/parallel/MountCmd/specific-port
functional_test_mount_test.go:213: (dbg) daemon: [out/minikube-linux-arm64 mount -p functional-224594 /tmp/TestFunctionalparallelMountCmdspecific-port760173886/001:/mount-9p --alsologtostderr -v=1 --port 46464]
functional_test_mount_test.go:243: (dbg) Run:  out/minikube-linux-arm64 -p functional-224594 ssh "findmnt -T /mount-9p | grep 9p"
functional_test_mount_test.go:243: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-224594 ssh "findmnt -T /mount-9p | grep 9p": exit status 1 (489.646438ms)

** stderr ** 
	ssh: Process exited with status 1

** /stderr **
I1202 18:58:36.016497    4435 retry.go:31] will retry after 463.516077ms: exit status 1
functional_test_mount_test.go:243: (dbg) Run:  out/minikube-linux-arm64 -p functional-224594 ssh "findmnt -T /mount-9p | grep 9p"
2025/12/02 18:58:36 [DEBUG] GET http://127.0.0.1:36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/
functional_test_mount_test.go:257: (dbg) Run:  out/minikube-linux-arm64 -p functional-224594 ssh -- ls -la /mount-9p
functional_test_mount_test.go:261: guest mount directory contents
total 0
functional_test_mount_test.go:263: (dbg) stopping [out/minikube-linux-arm64 mount -p functional-224594 /tmp/TestFunctionalparallelMountCmdspecific-port760173886/001:/mount-9p --alsologtostderr -v=1 --port 46464] ...
functional_test_mount_test.go:264: reading mount text
functional_test_mount_test.go:278: done reading mount text
functional_test_mount_test.go:230: (dbg) Run:  out/minikube-linux-arm64 -p functional-224594 ssh "sudo umount -f /mount-9p"
functional_test_mount_test.go:230: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-224594 ssh "sudo umount -f /mount-9p": exit status 1 (337.863997ms)

-- stdout --
	umount: /mount-9p: not mounted.

-- /stdout --
** stderr ** 
	ssh: Process exited with status 32

** /stderr **
functional_test_mount_test.go:232: "out/minikube-linux-arm64 -p functional-224594 ssh \"sudo umount -f /mount-9p\"": exit status 1
functional_test_mount_test.go:234: (dbg) stopping [out/minikube-linux-arm64 mount -p functional-224594 /tmp/TestFunctionalparallelMountCmdspecific-port760173886/001:/mount-9p --alsologtostderr -v=1 --port 46464] ...
--- PASS: TestFunctional/parallel/MountCmd/specific-port (2.33s)

TestFunctional/parallel/Version/short (0.08s)

=== RUN   TestFunctional/parallel/Version/short
=== PAUSE TestFunctional/parallel/Version/short
=== CONT  TestFunctional/parallel/Version/short
functional_test.go:2261: (dbg) Run:  out/minikube-linux-arm64 -p functional-224594 version --short
--- PASS: TestFunctional/parallel/Version/short (0.08s)

TestFunctional/parallel/Version/components (1.35s)

=== RUN   TestFunctional/parallel/Version/components
=== PAUSE TestFunctional/parallel/Version/components
=== CONT  TestFunctional/parallel/Version/components
functional_test.go:2275: (dbg) Run:  out/minikube-linux-arm64 -p functional-224594 version -o=json --components
functional_test.go:2275: (dbg) Done: out/minikube-linux-arm64 -p functional-224594 version -o=json --components: (1.350788484s)
--- PASS: TestFunctional/parallel/Version/components (1.35s)

TestFunctional/parallel/MountCmd/VerifyCleanup (2.33s)

=== RUN   TestFunctional/parallel/MountCmd/VerifyCleanup
functional_test_mount_test.go:298: (dbg) daemon: [out/minikube-linux-arm64 mount -p functional-224594 /tmp/TestFunctionalparallelMountCmdVerifyCleanup3645566539/001:/mount1 --alsologtostderr -v=1]
functional_test_mount_test.go:298: (dbg) daemon: [out/minikube-linux-arm64 mount -p functional-224594 /tmp/TestFunctionalparallelMountCmdVerifyCleanup3645566539/001:/mount2 --alsologtostderr -v=1]
functional_test_mount_test.go:298: (dbg) daemon: [out/minikube-linux-arm64 mount -p functional-224594 /tmp/TestFunctionalparallelMountCmdVerifyCleanup3645566539/001:/mount3 --alsologtostderr -v=1]
functional_test_mount_test.go:325: (dbg) Run:  out/minikube-linux-arm64 -p functional-224594 ssh "findmnt -T" /mount1
functional_test_mount_test.go:325: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-224594 ssh "findmnt -T" /mount1: exit status 1 (704.479265ms)

** stderr ** 
	ssh: Process exited with status 1

** /stderr **
I1202 18:58:38.569590    4435 retry.go:31] will retry after 428.461232ms: exit status 1
functional_test_mount_test.go:325: (dbg) Run:  out/minikube-linux-arm64 -p functional-224594 ssh "findmnt -T" /mount1
functional_test_mount_test.go:325: (dbg) Run:  out/minikube-linux-arm64 -p functional-224594 ssh "findmnt -T" /mount2
functional_test_mount_test.go:325: (dbg) Run:  out/minikube-linux-arm64 -p functional-224594 ssh "findmnt -T" /mount3
functional_test_mount_test.go:370: (dbg) Run:  out/minikube-linux-arm64 mount -p functional-224594 --kill=true
functional_test_mount_test.go:313: (dbg) stopping [out/minikube-linux-arm64 mount -p functional-224594 /tmp/TestFunctionalparallelMountCmdVerifyCleanup3645566539/001:/mount1 --alsologtostderr -v=1] ...
helpers_test.go:507: unable to find parent, assuming dead: process does not exist
functional_test_mount_test.go:313: (dbg) stopping [out/minikube-linux-arm64 mount -p functional-224594 /tmp/TestFunctionalparallelMountCmdVerifyCleanup3645566539/001:/mount2 --alsologtostderr -v=1] ...
helpers_test.go:507: unable to find parent, assuming dead: process does not exist
functional_test_mount_test.go:313: (dbg) stopping [out/minikube-linux-arm64 mount -p functional-224594 /tmp/TestFunctionalparallelMountCmdVerifyCleanup3645566539/001:/mount3 --alsologtostderr -v=1] ...
helpers_test.go:507: unable to find parent, assuming dead: process does not exist
--- PASS: TestFunctional/parallel/MountCmd/VerifyCleanup (2.33s)

TestFunctional/parallel/ImageCommands/ImageListShort (0.34s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageListShort
=== PAUSE TestFunctional/parallel/ImageCommands/ImageListShort
=== CONT  TestFunctional/parallel/ImageCommands/ImageListShort
functional_test.go:276: (dbg) Run:  out/minikube-linux-arm64 -p functional-224594 image ls --format short --alsologtostderr
functional_test.go:281: (dbg) Stdout: out/minikube-linux-arm64 -p functional-224594 image ls --format short --alsologtostderr:
registry.k8s.io/pause:latest
registry.k8s.io/pause:3.3
registry.k8s.io/pause:3.10.1
registry.k8s.io/pause:3.1
registry.k8s.io/kube-scheduler:v1.34.2
registry.k8s.io/kube-proxy:v1.34.2
registry.k8s.io/kube-controller-manager:v1.34.2
registry.k8s.io/kube-apiserver:v1.34.2
registry.k8s.io/etcd:3.6.5-0
registry.k8s.io/coredns/coredns:v1.12.1
gcr.io/k8s-minikube/storage-provisioner:v5
gcr.io/k8s-minikube/busybox:1.28.4-glibc
docker.io/library/nginx:latest
docker.io/library/nginx:alpine
docker.io/library/minikube-local-cache-test:functional-224594
docker.io/kindest/kindnetd:v20250512-df8de77b
docker.io/kicbase/echo-server:functional-224594
functional_test.go:284: (dbg) Stderr: out/minikube-linux-arm64 -p functional-224594 image ls --format short --alsologtostderr:
I1202 18:58:45.087207   42003 out.go:360] Setting OutFile to fd 1 ...
I1202 18:58:45.087511   42003 out.go:408] TERM=,COLORTERM=, which probably does not support color
I1202 18:58:45.087546   42003 out.go:374] Setting ErrFile to fd 2...
I1202 18:58:45.087568   42003 out.go:408] TERM=,COLORTERM=, which probably does not support color
I1202 18:58:45.087905   42003 root.go:338] Updating PATH: /home/jenkins/minikube-integration/22021-2487/.minikube/bin
I1202 18:58:45.088691   42003 config.go:182] Loaded profile config "functional-224594": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.34.2
I1202 18:58:45.088875   42003 config.go:182] Loaded profile config "functional-224594": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.34.2
I1202 18:58:45.089490   42003 cli_runner.go:164] Run: docker container inspect functional-224594 --format={{.State.Status}}
I1202 18:58:45.122706   42003 ssh_runner.go:195] Run: systemctl --version
I1202 18:58:45.122772   42003 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-224594
I1202 18:58:45.149210   42003 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:32783 SSHKeyPath:/home/jenkins/minikube-integration/22021-2487/.minikube/machines/functional-224594/id_rsa Username:docker}
I1202 18:58:45.275913   42003 ssh_runner.go:195] Run: sudo crictl images --output json
--- PASS: TestFunctional/parallel/ImageCommands/ImageListShort (0.34s)

TestFunctional/parallel/ImageCommands/ImageListTable (0.3s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageListTable
=== PAUSE TestFunctional/parallel/ImageCommands/ImageListTable
=== CONT  TestFunctional/parallel/ImageCommands/ImageListTable
functional_test.go:276: (dbg) Run:  out/minikube-linux-arm64 -p functional-224594 image ls --format table --alsologtostderr
functional_test.go:281: (dbg) Stdout: out/minikube-linux-arm64 -p functional-224594 image ls --format table --alsologtostderr:
┌─────────────────────────────────────────────┬────────────────────┬───────────────┬────────┐
│                    IMAGE                    │        TAG         │   IMAGE ID    │  SIZE  │
├─────────────────────────────────────────────┼────────────────────┼───────────────┼────────┤
│ registry.k8s.io/coredns/coredns             │ v1.12.1            │ sha256:138784 │ 20.4MB │
│ registry.k8s.io/etcd                        │ 3.6.5-0            │ sha256:2c5f0d │ 21.1MB │
│ registry.k8s.io/kube-scheduler              │ v1.34.2            │ sha256:4f982e │ 15.8MB │
│ docker.io/kicbase/echo-server               │ functional-224594  │ sha256:ce2d2c │ 2.17MB │
│ gcr.io/k8s-minikube/storage-provisioner     │ v5                 │ sha256:ba04bb │ 8.03MB │
│ registry.k8s.io/kube-controller-manager     │ v1.34.2            │ sha256:1b3491 │ 20.7MB │
│ registry.k8s.io/kube-proxy                  │ v1.34.2            │ sha256:94bff1 │ 22.8MB │
│ registry.k8s.io/pause                       │ 3.3                │ sha256:3d1873 │ 249kB  │
│ docker.io/library/minikube-local-cache-test │ functional-224594  │ sha256:9d59d8 │ 992B   │
│ docker.io/library/nginx                     │ latest             │ sha256:bb747c │ 58.3MB │
│ registry.k8s.io/pause                       │ 3.1                │ sha256:8057e0 │ 262kB  │
│ registry.k8s.io/pause                       │ 3.10.1             │ sha256:d7b100 │ 268kB  │
│ registry.k8s.io/pause                       │ latest             │ sha256:8cb209 │ 71.3kB │
│ docker.io/library/nginx                     │ alpine             │ sha256:cbad63 │ 23.1MB │
│ registry.k8s.io/kube-apiserver              │ v1.34.2            │ sha256:b178af │ 24.6MB │
│ docker.io/kindest/kindnetd                  │ v20250512-df8de77b │ sha256:b1a8c6 │ 40.6MB │
│ gcr.io/k8s-minikube/busybox                 │ 1.28.4-glibc       │ sha256:1611cd │ 1.94MB │
└─────────────────────────────────────────────┴────────────────────┴───────────────┴────────┘
functional_test.go:284: (dbg) Stderr: out/minikube-linux-arm64 -p functional-224594 image ls --format table --alsologtostderr:
I1202 18:58:45.743935   42179 out.go:360] Setting OutFile to fd 1 ...
I1202 18:58:45.744127   42179 out.go:408] TERM=,COLORTERM=, which probably does not support color
I1202 18:58:45.744155   42179 out.go:374] Setting ErrFile to fd 2...
I1202 18:58:45.744175   42179 out.go:408] TERM=,COLORTERM=, which probably does not support color
I1202 18:58:45.744671   42179 root.go:338] Updating PATH: /home/jenkins/minikube-integration/22021-2487/.minikube/bin
I1202 18:58:45.745842   42179 config.go:182] Loaded profile config "functional-224594": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.34.2
I1202 18:58:45.746028   42179 config.go:182] Loaded profile config "functional-224594": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.34.2
I1202 18:58:45.746637   42179 cli_runner.go:164] Run: docker container inspect functional-224594 --format={{.State.Status}}
I1202 18:58:45.773717   42179 ssh_runner.go:195] Run: systemctl --version
I1202 18:58:45.773780   42179 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-224594
I1202 18:58:45.801082   42179 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:32783 SSHKeyPath:/home/jenkins/minikube-integration/22021-2487/.minikube/machines/functional-224594/id_rsa Username:docker}
I1202 18:58:45.927736   42179 ssh_runner.go:195] Run: sudo crictl images --output json
--- PASS: TestFunctional/parallel/ImageCommands/ImageListTable (0.30s)

TestFunctional/parallel/ImageCommands/ImageListJson (0.32s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageListJson
=== PAUSE TestFunctional/parallel/ImageCommands/ImageListJson
=== CONT  TestFunctional/parallel/ImageCommands/ImageListJson
functional_test.go:276: (dbg) Run:  out/minikube-linux-arm64 -p functional-224594 image ls --format json --alsologtostderr
functional_test.go:281: (dbg) Stdout: out/minikube-linux-arm64 -p functional-224594 image ls --format json --alsologtostderr:
[{"id":"sha256:b1a8c6f707935fd5f346ce5846d21ff8dd65e14c15406a14dbd16b9b897b9b4c","repoDigests":["docker.io/kindest/kindnetd@sha256:07a4b3fe0077a0ae606cc0a200fc25a28fa64dcc30b8d311b461089969449f9a"],"repoTags":["docker.io/kindest/kindnetd:v20250512-df8de77b"],"size":"40636774"},{"id":"sha256:9d59d8178e3a4c209f1c923737212d2bc54133c85455f4f8e051d069a9d30853","repoDigests":[],"repoTags":["docker.io/library/minikube-local-cache-test:functional-224594"],"size":"992"},{"id":"sha256:a422e0e982356f6c1cf0e5bb7b733363caae3992a07c99951fbcc73e58ed656a","repoDigests":["docker.io/kubernetesui/metrics-scraper@sha256:76049887f07a0476dc93efc2d3569b9529bf982b22d29f356092ce206e98765c"],"repoTags":[],"size":"18306114"},{"id":"sha256:b178af3d91f80925cd8bec42e1813e7d46370236a811d3380c9c10a02b245ca7","repoDigests":["registry.k8s.io/kube-apiserver@sha256:e009ef63deaf797763b5bd423d04a099a2fe414a081bf7d216b43bc9e76b9077"],"repoTags":["registry.k8s.io/kube-apiserver:v1.34.2"],"size":"24559643"},{"id":"sha256:d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd","repoDigests":["registry.k8s.io/pause@sha256:278fb9dbcca9518083ad1e11276933a2e96f23de604a3a08cc3c80002767d24c"],"repoTags":["registry.k8s.io/pause:3.10.1"],"size":"267939"},{"id":"sha256:bb747ca923a5e1139baddd6f4743e0c0c74df58f4ad8ddbc10ab183b92f5a5c7","repoDigests":["docker.io/library/nginx@sha256:553f64aecdc31b5bf944521731cd70e35da4faed96b2b7548a3d8e2598c52a42"],"repoTags":["docker.io/library/nginx:latest"],"size":"58263548"},{"id":"sha256:ba04bb24b95753201135cbc420b233c1b0b9fa2e1fd21d28319c348c33fbcde6","repoDigests":["gcr.io/k8s-minikube/storage-provisioner@sha256:18eb69d1418e854ad5a19e399310e52808a8321e4c441c1dddad8977a0d7a944"],"repoTags":["gcr.io/k8s-minikube/storage-provisioner:v5"],"size":"8034419"},{"id":"sha256:2c5f0dedd21c25ec3a6709934d22152d53ec50fe57b72d29e4450655e3d14d42","repoDigests":["registry.k8s.io/etcd@sha256:042ef9c02799eb9303abf1aa99b09f09d94b8ee3ba0c2dd3f42dc4e1d3dce534"],"repoTags":["registry.k8s.io/etcd:3.6.5-0"],"size":"21136588"},{"id":"sha256:94bff1bec29fd04573941f362e44a6730b151d46df215613feb3f1167703f786","repoDigests":["registry.k8s.io/kube-proxy@sha256:d8b843ac8a5e861238df24a4db8c2ddced89948633400c4660464472045276f5"],"repoTags":["registry.k8s.io/kube-proxy:v1.34.2"],"size":"22802260"},{"id":"sha256:3d18732f8686cc3c878055d99a05fa80289502fa496b36b6a0fe0f77206a7300","repoDigests":[],"repoTags":["registry.k8s.io/pause:3.3"],"size":"249461"},{"id":"sha256:cbad6347cca28a6ee7b08793856bc6fcb2c2c7a377a62a5e6d785895c4194ac1","repoDigests":["docker.io/library/nginx@sha256:b3c656d55d7ad751196f21b7fd2e8d4da9cb430e32f646adcf92441b72f82b14"],"repoTags":["docker.io/library/nginx:alpine"],"size":"23117513"},{"id":"sha256:1611cd07b61d57dbbfebe6db242513fd51e1c02d20ba08af17a45837d86a8a8c","repoDigests":["gcr.io/k8s-minikube/busybox@sha256:2d03e6ceeb99250061dd110530b0ece7998cd84121f952adef120ea7c5a6f00e"],"repoTags":["gcr.io/k8s-minikube/busybox:1.28.4-glibc"],"size":"1935750"},{"id":"sha256:138784d87c9c50f8e59412544da4cf4928d61ccbaf93b9f5898a3ba406871bfc","repoDigests":["registry.k8s.io/coredns/coredns@sha256:e8c262566636e6bc340ece6473b0eed193cad045384401529721ddbe6463d31c"],"repoTags":["registry.k8s.io/coredns/coredns:v1.12.1"],"size":"20392204"},{"id":"sha256:1b34917560f0916ad0d1e98debeaf98c640b68c5a38f6d87711f0e288e5d7be2","repoDigests":["registry.k8s.io/kube-controller-manager@sha256:5c3998664b77441c09a4604f1361b230e63f7a6f299fc02fc1ebd1a12c38e3eb"],"repoTags":["registry.k8s.io/kube-controller-manager:v1.34.2"],"size":"20718696"},{"id":"sha256:8057e0500773a37cde2cff041eb13ebd68c748419a2fbfd1dfb5bf38696cc8e5","repoDigests":[],"repoTags":["registry.k8s.io/pause:3.1"],"size":"262191"},{"id":"sha256:ce2d2cda2d858fdaea84129deb86d18e5dbf1c548f230b79fdca74cc91729d17","repoDigests":[],"repoTags":["docker.io/kicbase/echo-server:functional-224594"],"size":"2173567"},{"id":"sha256:20b332c9a70d8516d849d1ac23eff5800cbb2f263d379f0ec11ee908db6b25a8","repoDigests":["docker.io/kubernetesui/dashboard@sha256:2e500d29e9d5f4a086b908eb8dfe7ecac57d2ab09d65b24f588b1d449841ef93"],"repoTags":[],"size":"74084559"},{"id":"sha256:4f982e73e768a6ccebb54f8905b83b78d56b3a014e709c0bfe77140db3543949","repoDigests":["registry.k8s.io/kube-scheduler@sha256:44229946c0966b07d5c0791681d803e77258949985e49b4ab0fbdff99d2a48c6"],"repoTags":["registry.k8s.io/kube-scheduler:v1.34.2"],"size":"15775785"},{"id":"sha256:8cb2091f603e75187e2f6226c5901d12e00b1d1f778c6471ae4578e8a1c4724a","repoDigests":[],"repoTags":["registry.k8s.io/pause:latest"],"size":"71300"}]
functional_test.go:284: (dbg) Stderr: out/minikube-linux-arm64 -p functional-224594 image ls --format json --alsologtostderr:
I1202 18:58:45.455431   42084 out.go:360] Setting OutFile to fd 1 ...
I1202 18:58:45.455600   42084 out.go:408] TERM=,COLORTERM=, which probably does not support color
I1202 18:58:45.455613   42084 out.go:374] Setting ErrFile to fd 2...
I1202 18:58:45.455620   42084 out.go:408] TERM=,COLORTERM=, which probably does not support color
I1202 18:58:45.455916   42084 root.go:338] Updating PATH: /home/jenkins/minikube-integration/22021-2487/.minikube/bin
I1202 18:58:45.456607   42084 config.go:182] Loaded profile config "functional-224594": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.34.2
I1202 18:58:45.456776   42084 config.go:182] Loaded profile config "functional-224594": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.34.2
I1202 18:58:45.457352   42084 cli_runner.go:164] Run: docker container inspect functional-224594 --format={{.State.Status}}
I1202 18:58:45.478647   42084 ssh_runner.go:195] Run: systemctl --version
I1202 18:58:45.478711   42084 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-224594
I1202 18:58:45.532915   42084 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:32783 SSHKeyPath:/home/jenkins/minikube-integration/22021-2487/.minikube/machines/functional-224594/id_rsa Username:docker}
I1202 18:58:45.643114   42084 ssh_runner.go:195] Run: sudo crictl images --output json
--- PASS: TestFunctional/parallel/ImageCommands/ImageListJson (0.32s)
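The JSON format above is a flat array of image records, each carrying `id`, `repoDigests`, `repoTags`, and `size` (serialized as a string of bytes). A minimal sketch of post-processing that output — using a two-record sample with the field names taken from the log above; the helper names are our own, not minikube's:

```python
import json

# Shape of `minikube image ls --format json` output, as seen in the log above:
# a JSON array of records with id, repoDigests, repoTags, and size (a string).
sample = json.loads("""[
  {"id": "sha256:ba04bb", "repoDigests": [],
   "repoTags": ["gcr.io/k8s-minikube/storage-provisioner:v5"], "size": "8034419"},
  {"id": "sha256:20b332",
   "repoDigests": ["docker.io/kubernetesui/dashboard@sha256:2e50"],
   "repoTags": [], "size": "74084559"}
]""")

def untagged(images):
    """Return ids of images that have no repoTags (e.g. dashboard images
    pulled by digest only, as in the listing above)."""
    return [img["id"] for img in images if not img["repoTags"]]

def total_size(images):
    """Sum the size field across all records; note it is a string in the JSON."""
    return sum(int(img["size"]) for img in images)

print(untagged(sample))   # ids with empty repoTags
print(total_size(sample))
```

In a live environment the `sample` literal would be replaced by the command's stdout, e.g. `json.loads(subprocess.check_output(["minikube", "image", "ls", "--format", "json"]))`.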

TestFunctional/parallel/ImageCommands/ImageListYaml (0.33s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageListYaml
=== PAUSE TestFunctional/parallel/ImageCommands/ImageListYaml
=== CONT  TestFunctional/parallel/ImageCommands/ImageListYaml
functional_test.go:276: (dbg) Run:  out/minikube-linux-arm64 -p functional-224594 image ls --format yaml --alsologtostderr
functional_test.go:281: (dbg) Stdout: out/minikube-linux-arm64 -p functional-224594 image ls --format yaml --alsologtostderr:
- id: sha256:cbad6347cca28a6ee7b08793856bc6fcb2c2c7a377a62a5e6d785895c4194ac1
repoDigests:
- docker.io/library/nginx@sha256:b3c656d55d7ad751196f21b7fd2e8d4da9cb430e32f646adcf92441b72f82b14
repoTags:
- docker.io/library/nginx:alpine
size: "23117513"
- id: sha256:bb747ca923a5e1139baddd6f4743e0c0c74df58f4ad8ddbc10ab183b92f5a5c7
repoDigests:
- docker.io/library/nginx@sha256:553f64aecdc31b5bf944521731cd70e35da4faed96b2b7548a3d8e2598c52a42
repoTags:
- docker.io/library/nginx:latest
size: "58263548"
- id: sha256:b178af3d91f80925cd8bec42e1813e7d46370236a811d3380c9c10a02b245ca7
repoDigests:
- registry.k8s.io/kube-apiserver@sha256:e009ef63deaf797763b5bd423d04a099a2fe414a081bf7d216b43bc9e76b9077
repoTags:
- registry.k8s.io/kube-apiserver:v1.34.2
size: "24559643"
- id: sha256:8cb2091f603e75187e2f6226c5901d12e00b1d1f778c6471ae4578e8a1c4724a
repoDigests: []
repoTags:
- registry.k8s.io/pause:latest
size: "71300"
- id: sha256:ce2d2cda2d858fdaea84129deb86d18e5dbf1c548f230b79fdca74cc91729d17
repoDigests: []
repoTags:
- docker.io/kicbase/echo-server:functional-224594
size: "2173567"
- id: sha256:9d59d8178e3a4c209f1c923737212d2bc54133c85455f4f8e051d069a9d30853
repoDigests: []
repoTags:
- docker.io/library/minikube-local-cache-test:functional-224594
size: "992"
- id: sha256:94bff1bec29fd04573941f362e44a6730b151d46df215613feb3f1167703f786
repoDigests:
- registry.k8s.io/kube-proxy@sha256:d8b843ac8a5e861238df24a4db8c2ddced89948633400c4660464472045276f5
repoTags:
- registry.k8s.io/kube-proxy:v1.34.2
size: "22802260"
- id: sha256:3d18732f8686cc3c878055d99a05fa80289502fa496b36b6a0fe0f77206a7300
repoDigests: []
repoTags:
- registry.k8s.io/pause:3.3
size: "249461"
- id: sha256:b1a8c6f707935fd5f346ce5846d21ff8dd65e14c15406a14dbd16b9b897b9b4c
repoDigests:
- docker.io/kindest/kindnetd@sha256:07a4b3fe0077a0ae606cc0a200fc25a28fa64dcc30b8d311b461089969449f9a
repoTags:
- docker.io/kindest/kindnetd:v20250512-df8de77b
size: "40636774"
- id: sha256:20b332c9a70d8516d849d1ac23eff5800cbb2f263d379f0ec11ee908db6b25a8
repoDigests:
- docker.io/kubernetesui/dashboard@sha256:2e500d29e9d5f4a086b908eb8dfe7ecac57d2ab09d65b24f588b1d449841ef93
repoTags: []
size: "74084559"
- id: sha256:1611cd07b61d57dbbfebe6db242513fd51e1c02d20ba08af17a45837d86a8a8c
repoDigests:
- gcr.io/k8s-minikube/busybox@sha256:2d03e6ceeb99250061dd110530b0ece7998cd84121f952adef120ea7c5a6f00e
repoTags:
- gcr.io/k8s-minikube/busybox:1.28.4-glibc
size: "1935750"
- id: sha256:138784d87c9c50f8e59412544da4cf4928d61ccbaf93b9f5898a3ba406871bfc
repoDigests:
- registry.k8s.io/coredns/coredns@sha256:e8c262566636e6bc340ece6473b0eed193cad045384401529721ddbe6463d31c
repoTags:
- registry.k8s.io/coredns/coredns:v1.12.1
size: "20392204"
- id: sha256:4f982e73e768a6ccebb54f8905b83b78d56b3a014e709c0bfe77140db3543949
repoDigests:
- registry.k8s.io/kube-scheduler@sha256:44229946c0966b07d5c0791681d803e77258949985e49b4ab0fbdff99d2a48c6
repoTags:
- registry.k8s.io/kube-scheduler:v1.34.2
size: "15775785"
- id: sha256:d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd
repoDigests:
- registry.k8s.io/pause@sha256:278fb9dbcca9518083ad1e11276933a2e96f23de604a3a08cc3c80002767d24c
repoTags:
- registry.k8s.io/pause:3.10.1
size: "267939"
- id: sha256:a422e0e982356f6c1cf0e5bb7b733363caae3992a07c99951fbcc73e58ed656a
repoDigests:
- docker.io/kubernetesui/metrics-scraper@sha256:76049887f07a0476dc93efc2d3569b9529bf982b22d29f356092ce206e98765c
repoTags: []
size: "18306114"
- id: sha256:ba04bb24b95753201135cbc420b233c1b0b9fa2e1fd21d28319c348c33fbcde6
repoDigests:
- gcr.io/k8s-minikube/storage-provisioner@sha256:18eb69d1418e854ad5a19e399310e52808a8321e4c441c1dddad8977a0d7a944
repoTags:
- gcr.io/k8s-minikube/storage-provisioner:v5
size: "8034419"
- id: sha256:2c5f0dedd21c25ec3a6709934d22152d53ec50fe57b72d29e4450655e3d14d42
repoDigests:
- registry.k8s.io/etcd@sha256:042ef9c02799eb9303abf1aa99b09f09d94b8ee3ba0c2dd3f42dc4e1d3dce534
repoTags:
- registry.k8s.io/etcd:3.6.5-0
size: "21136588"
- id: sha256:1b34917560f0916ad0d1e98debeaf98c640b68c5a38f6d87711f0e288e5d7be2
repoDigests:
- registry.k8s.io/kube-controller-manager@sha256:5c3998664b77441c09a4604f1361b230e63f7a6f299fc02fc1ebd1a12c38e3eb
repoTags:
- registry.k8s.io/kube-controller-manager:v1.34.2
size: "20718696"
- id: sha256:8057e0500773a37cde2cff041eb13ebd68c748419a2fbfd1dfb5bf38696cc8e5
repoDigests: []
repoTags:
- registry.k8s.io/pause:3.1
size: "262191"

functional_test.go:284: (dbg) Stderr: out/minikube-linux-arm64 -p functional-224594 image ls --format yaml --alsologtostderr:
I1202 18:58:45.125358   42008 out.go:360] Setting OutFile to fd 1 ...
I1202 18:58:45.125585   42008 out.go:408] TERM=,COLORTERM=, which probably does not support color
I1202 18:58:45.125601   42008 out.go:374] Setting ErrFile to fd 2...
I1202 18:58:45.125609   42008 out.go:408] TERM=,COLORTERM=, which probably does not support color
I1202 18:58:45.126035   42008 root.go:338] Updating PATH: /home/jenkins/minikube-integration/22021-2487/.minikube/bin
I1202 18:58:45.126856   42008 config.go:182] Loaded profile config "functional-224594": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.34.2
I1202 18:58:45.127037   42008 config.go:182] Loaded profile config "functional-224594": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.34.2
I1202 18:58:45.127725   42008 cli_runner.go:164] Run: docker container inspect functional-224594 --format={{.State.Status}}
I1202 18:58:45.155952   42008 ssh_runner.go:195] Run: systemctl --version
I1202 18:58:45.156018   42008 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-224594
I1202 18:58:45.186463   42008 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:32783 SSHKeyPath:/home/jenkins/minikube-integration/22021-2487/.minikube/machines/functional-224594/id_rsa Username:docker}
I1202 18:58:45.312486   42008 ssh_runner.go:195] Run: sudo crictl images --output json
--- PASS: TestFunctional/parallel/ImageCommands/ImageListYaml (0.33s)

TestFunctional/parallel/ImageCommands/ImageBuild (4s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageBuild
=== PAUSE TestFunctional/parallel/ImageCommands/ImageBuild
=== CONT  TestFunctional/parallel/ImageCommands/ImageBuild
functional_test.go:323: (dbg) Run:  out/minikube-linux-arm64 -p functional-224594 ssh pgrep buildkitd
functional_test.go:323: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-224594 ssh pgrep buildkitd: exit status 1 (418.655528ms)

** stderr ** 
	ssh: Process exited with status 1

** /stderr **
functional_test.go:330: (dbg) Run:  out/minikube-linux-arm64 -p functional-224594 image build -t localhost/my-image:functional-224594 testdata/build --alsologtostderr
functional_test.go:330: (dbg) Done: out/minikube-linux-arm64 -p functional-224594 image build -t localhost/my-image:functional-224594 testdata/build --alsologtostderr: (3.355755213s)
functional_test.go:338: (dbg) Stderr: out/minikube-linux-arm64 -p functional-224594 image build -t localhost/my-image:functional-224594 testdata/build --alsologtostderr:
I1202 18:58:45.803843   42185 out.go:360] Setting OutFile to fd 1 ...
I1202 18:58:45.804019   42185 out.go:408] TERM=,COLORTERM=, which probably does not support color
I1202 18:58:45.804024   42185 out.go:374] Setting ErrFile to fd 2...
I1202 18:58:45.804029   42185 out.go:408] TERM=,COLORTERM=, which probably does not support color
I1202 18:58:45.804264   42185 root.go:338] Updating PATH: /home/jenkins/minikube-integration/22021-2487/.minikube/bin
I1202 18:58:45.804852   42185 config.go:182] Loaded profile config "functional-224594": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.34.2
I1202 18:58:45.809346   42185 config.go:182] Loaded profile config "functional-224594": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.34.2
I1202 18:58:45.809882   42185 cli_runner.go:164] Run: docker container inspect functional-224594 --format={{.State.Status}}
I1202 18:58:45.829947   42185 ssh_runner.go:195] Run: systemctl --version
I1202 18:58:45.829997   42185 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-224594
I1202 18:58:45.848953   42185 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:32783 SSHKeyPath:/home/jenkins/minikube-integration/22021-2487/.minikube/machines/functional-224594/id_rsa Username:docker}
I1202 18:58:45.965375   42185 build_images.go:162] Building image from path: /tmp/build.2558885031.tar
I1202 18:58:45.965441   42185 ssh_runner.go:195] Run: sudo mkdir -p /var/lib/minikube/build
I1202 18:58:45.979728   42185 ssh_runner.go:195] Run: stat -c "%s %y" /var/lib/minikube/build/build.2558885031.tar
I1202 18:58:45.990601   42185 ssh_runner.go:352] existence check for /var/lib/minikube/build/build.2558885031.tar: stat -c "%s %y" /var/lib/minikube/build/build.2558885031.tar: Process exited with status 1
stdout:

stderr:
stat: cannot statx '/var/lib/minikube/build/build.2558885031.tar': No such file or directory
I1202 18:58:45.990642   42185 ssh_runner.go:362] scp /tmp/build.2558885031.tar --> /var/lib/minikube/build/build.2558885031.tar (3072 bytes)
I1202 18:58:46.015551   42185 ssh_runner.go:195] Run: sudo mkdir -p /var/lib/minikube/build/build.2558885031
I1202 18:58:46.024154   42185 ssh_runner.go:195] Run: sudo tar -C /var/lib/minikube/build/build.2558885031 -xf /var/lib/minikube/build/build.2558885031.tar
I1202 18:58:46.032594   42185 containerd.go:394] Building image: /var/lib/minikube/build/build.2558885031
I1202 18:58:46.032667   42185 ssh_runner.go:195] Run: sudo buildctl build --frontend dockerfile.v0 --local context=/var/lib/minikube/build/build.2558885031 --local dockerfile=/var/lib/minikube/build/build.2558885031 --output type=image,name=localhost/my-image:functional-224594
#1 [internal] load build definition from Dockerfile
#1 transferring dockerfile: 97B done
#1 DONE 0.0s

#2 [internal] load metadata for gcr.io/k8s-minikube/busybox:latest
#2 DONE 1.4s

#3 [internal] load .dockerignore
#3 transferring context: 2B done
#3 DONE 0.0s

#4 [internal] load build context
#4 transferring context: 62B 0.0s done
#4 DONE 0.0s

#5 [1/3] FROM gcr.io/k8s-minikube/busybox:latest@sha256:ca5ae90100d50772da31f3b5016209e25ad61972404e2ccd83d44f10dee7e79b
#5 resolve gcr.io/k8s-minikube/busybox:latest@sha256:ca5ae90100d50772da31f3b5016209e25ad61972404e2ccd83d44f10dee7e79b 0.0s done
#5 DONE 0.1s

#5 [1/3] FROM gcr.io/k8s-minikube/busybox:latest@sha256:ca5ae90100d50772da31f3b5016209e25ad61972404e2ccd83d44f10dee7e79b
#5 sha256:a01966dde7f8d5ba10b6d87e776c7c8fb5a5f6bfa678874bd28b33b1fc6dba34 0B / 828.50kB 0.2s
#5 sha256:a01966dde7f8d5ba10b6d87e776c7c8fb5a5f6bfa678874bd28b33b1fc6dba34 828.50kB / 828.50kB 0.6s done
#5 extracting sha256:a01966dde7f8d5ba10b6d87e776c7c8fb5a5f6bfa678874bd28b33b1fc6dba34 0.1s done
#5 DONE 0.7s

#6 [2/3] RUN true
#6 DONE 0.5s

#7 [3/3] ADD content.txt /
#7 DONE 0.0s

#8 exporting to image
#8 exporting layers 0.1s done
#8 exporting manifest sha256:48047a96c40c39e6a44225d63050c93cdd338f0237ba5b44e08feee95c8f9fa4
#8 exporting manifest sha256:48047a96c40c39e6a44225d63050c93cdd338f0237ba5b44e08feee95c8f9fa4 0.0s done
#8 exporting config sha256:cc7cc2b90efaa0e403b546bb5742304a0016c4948ffc40618ee7e6d62d1aae04 0.0s done
#8 naming to localhost/my-image:functional-224594 done
#8 DONE 0.1s
I1202 18:58:49.067866   42185 ssh_runner.go:235] Completed: sudo buildctl build --frontend dockerfile.v0 --local context=/var/lib/minikube/build/build.2558885031 --local dockerfile=/var/lib/minikube/build/build.2558885031 --output type=image,name=localhost/my-image:functional-224594: (3.035176101s)
I1202 18:58:49.067930   42185 ssh_runner.go:195] Run: sudo rm -rf /var/lib/minikube/build/build.2558885031
I1202 18:58:49.076696   42185 ssh_runner.go:195] Run: sudo rm -f /var/lib/minikube/build/build.2558885031.tar
I1202 18:58:49.084548   42185 build_images.go:218] Built localhost/my-image:functional-224594 from /tmp/build.2558885031.tar
I1202 18:58:49.084628   42185 build_images.go:134] succeeded building to: functional-224594
I1202 18:58:49.084639   42185 build_images.go:135] failed building to: 
functional_test.go:466: (dbg) Run:  out/minikube-linux-arm64 -p functional-224594 image ls
--- PASS: TestFunctional/parallel/ImageCommands/ImageBuild (4.00s)
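The trace above shows how `minikube image build` works on a containerd runtime: the build context is tarred on the host, copied into the node over SSH, unpacked into a scratch directory, and driven through BuildKit's `buildctl` with the dockerfile frontend. The equivalent steps, reconstructed as a sketch from the commands in the log (run inside the node; `build.XXXX` stands in for the temporary `build.2558885031` name, which is random per invocation):

```shell
# Scratch location used by minikube for the tarred build context.
BUILD=/var/lib/minikube/build/build.XXXX

# Unpack the copied context tarball into the scratch directory.
sudo mkdir -p "$BUILD"
sudo tar -C "$BUILD" -xf "$BUILD.tar"

# Drive BuildKit directly: dockerfile frontend, context and Dockerfile from
# the unpacked directory, result committed as a containerd image.
sudo buildctl build \
  --frontend dockerfile.v0 \
  --local context="$BUILD" \
  --local dockerfile="$BUILD" \
  --output type=image,name=localhost/my-image:functional-224594

# Clean up the scratch directory and tarball afterwards.
sudo rm -rf "$BUILD"
sudo rm -f "$BUILD.tar"
```

This is why the preceding `ssh pgrep buildkitd` probe matters: its non-zero exit tells the test that no standalone buildkitd needs to be reused, and the build still succeeds via `buildctl` talking to containerd's embedded BuildKit worker.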

TestFunctional/parallel/ImageCommands/Setup (0.69s)

=== RUN   TestFunctional/parallel/ImageCommands/Setup
functional_test.go:357: (dbg) Run:  docker pull kicbase/echo-server:1.0
functional_test.go:362: (dbg) Run:  docker tag kicbase/echo-server:1.0 kicbase/echo-server:functional-224594
--- PASS: TestFunctional/parallel/ImageCommands/Setup (0.69s)

TestFunctional/parallel/ImageCommands/ImageLoadDaemon (1.3s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageLoadDaemon
functional_test.go:370: (dbg) Run:  out/minikube-linux-arm64 -p functional-224594 image load --daemon kicbase/echo-server:functional-224594 --alsologtostderr
functional_test.go:370: (dbg) Done: out/minikube-linux-arm64 -p functional-224594 image load --daemon kicbase/echo-server:functional-224594 --alsologtostderr: (1.043155488s)
functional_test.go:466: (dbg) Run:  out/minikube-linux-arm64 -p functional-224594 image ls
--- PASS: TestFunctional/parallel/ImageCommands/ImageLoadDaemon (1.30s)

TestFunctional/parallel/ImageCommands/ImageReloadDaemon (1.34s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageReloadDaemon
functional_test.go:380: (dbg) Run:  out/minikube-linux-arm64 -p functional-224594 image load --daemon kicbase/echo-server:functional-224594 --alsologtostderr
functional_test.go:380: (dbg) Done: out/minikube-linux-arm64 -p functional-224594 image load --daemon kicbase/echo-server:functional-224594 --alsologtostderr: (1.068194371s)
functional_test.go:466: (dbg) Run:  out/minikube-linux-arm64 -p functional-224594 image ls
--- PASS: TestFunctional/parallel/ImageCommands/ImageReloadDaemon (1.34s)

TestFunctional/parallel/ImageCommands/ImageTagAndLoadDaemon (1.61s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageTagAndLoadDaemon
functional_test.go:250: (dbg) Run:  docker pull kicbase/echo-server:latest
functional_test.go:255: (dbg) Run:  docker tag kicbase/echo-server:latest kicbase/echo-server:functional-224594
functional_test.go:260: (dbg) Run:  out/minikube-linux-arm64 -p functional-224594 image load --daemon kicbase/echo-server:functional-224594 --alsologtostderr
functional_test.go:260: (dbg) Done: out/minikube-linux-arm64 -p functional-224594 image load --daemon kicbase/echo-server:functional-224594 --alsologtostderr: (1.035443664s)
functional_test.go:466: (dbg) Run:  out/minikube-linux-arm64 -p functional-224594 image ls
--- PASS: TestFunctional/parallel/ImageCommands/ImageTagAndLoadDaemon (1.61s)

TestFunctional/parallel/UpdateContextCmd/no_changes (0.2s)

=== RUN   TestFunctional/parallel/UpdateContextCmd/no_changes
=== PAUSE TestFunctional/parallel/UpdateContextCmd/no_changes
=== CONT  TestFunctional/parallel/UpdateContextCmd/no_changes
functional_test.go:2124: (dbg) Run:  out/minikube-linux-arm64 -p functional-224594 update-context --alsologtostderr -v=2
--- PASS: TestFunctional/parallel/UpdateContextCmd/no_changes (0.20s)

TestFunctional/parallel/UpdateContextCmd/no_minikube_cluster (0.28s)

=== RUN   TestFunctional/parallel/UpdateContextCmd/no_minikube_cluster
=== PAUSE TestFunctional/parallel/UpdateContextCmd/no_minikube_cluster
=== CONT  TestFunctional/parallel/UpdateContextCmd/no_minikube_cluster
functional_test.go:2124: (dbg) Run:  out/minikube-linux-arm64 -p functional-224594 update-context --alsologtostderr -v=2
--- PASS: TestFunctional/parallel/UpdateContextCmd/no_minikube_cluster (0.28s)

TestFunctional/parallel/UpdateContextCmd/no_clusters (0.19s)

=== RUN   TestFunctional/parallel/UpdateContextCmd/no_clusters
=== PAUSE TestFunctional/parallel/UpdateContextCmd/no_clusters

=== CONT  TestFunctional/parallel/UpdateContextCmd/no_clusters
functional_test.go:2124: (dbg) Run:  out/minikube-linux-arm64 -p functional-224594 update-context --alsologtostderr -v=2
--- PASS: TestFunctional/parallel/UpdateContextCmd/no_clusters (0.19s)

TestFunctional/parallel/ImageCommands/ImageSaveToFile (0.43s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageSaveToFile
functional_test.go:395: (dbg) Run:  out/minikube-linux-arm64 -p functional-224594 image save kicbase/echo-server:functional-224594 /home/jenkins/workspace/Docker_Linux_containerd_arm64/echo-server-save.tar --alsologtostderr
--- PASS: TestFunctional/parallel/ImageCommands/ImageSaveToFile (0.43s)

TestFunctional/parallel/ImageCommands/ImageRemove (0.51s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageRemove
functional_test.go:407: (dbg) Run:  out/minikube-linux-arm64 -p functional-224594 image rm kicbase/echo-server:functional-224594 --alsologtostderr
functional_test.go:466: (dbg) Run:  out/minikube-linux-arm64 -p functional-224594 image ls
--- PASS: TestFunctional/parallel/ImageCommands/ImageRemove (0.51s)

TestFunctional/parallel/ImageCommands/ImageLoadFromFile (0.7s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageLoadFromFile
functional_test.go:424: (dbg) Run:  out/minikube-linux-arm64 -p functional-224594 image load /home/jenkins/workspace/Docker_Linux_containerd_arm64/echo-server-save.tar --alsologtostderr
functional_test.go:466: (dbg) Run:  out/minikube-linux-arm64 -p functional-224594 image ls
--- PASS: TestFunctional/parallel/ImageCommands/ImageLoadFromFile (0.70s)

TestFunctional/parallel/ImageCommands/ImageSaveDaemon (0.48s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageSaveDaemon
functional_test.go:434: (dbg) Run:  docker rmi kicbase/echo-server:functional-224594
functional_test.go:439: (dbg) Run:  out/minikube-linux-arm64 -p functional-224594 image save --daemon kicbase/echo-server:functional-224594 --alsologtostderr
functional_test.go:447: (dbg) Run:  docker image inspect kicbase/echo-server:functional-224594
--- PASS: TestFunctional/parallel/ImageCommands/ImageSaveDaemon (0.48s)

TestFunctional/delete_echo-server_images (0.04s)

=== RUN   TestFunctional/delete_echo-server_images
functional_test.go:205: (dbg) Run:  docker rmi -f kicbase/echo-server:1.0
functional_test.go:205: (dbg) Run:  docker rmi -f kicbase/echo-server:functional-224594
--- PASS: TestFunctional/delete_echo-server_images (0.04s)

TestFunctional/delete_my-image_image (0.02s)

=== RUN   TestFunctional/delete_my-image_image
functional_test.go:213: (dbg) Run:  docker rmi -f localhost/my-image:functional-224594
--- PASS: TestFunctional/delete_my-image_image (0.02s)

TestFunctional/delete_minikube_cached_images (0.02s)

=== RUN   TestFunctional/delete_minikube_cached_images
functional_test.go:221: (dbg) Run:  docker rmi -f minikube-local-cache-test:functional-224594
--- PASS: TestFunctional/delete_minikube_cached_images (0.02s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/CopySyncFile (0s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/CopySyncFile
functional_test.go:1860: local sync path: /home/jenkins/minikube-integration/22021-2487/.minikube/files/etc/test/nested/copy/4435/hosts
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/CopySyncFile (0.00s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/AuditLog (0s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/AuditLog
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/AuditLog (0.00s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/KubeContext (0.07s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/KubeContext
functional_test.go:696: (dbg) Run:  kubectl config current-context
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/KubeContext (0.07s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/CacheCmd/cache/add_remote (3.39s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/CacheCmd/cache/add_remote
functional_test.go:1064: (dbg) Run:  out/minikube-linux-arm64 -p functional-449836 cache add registry.k8s.io/pause:3.1
functional_test.go:1064: (dbg) Done: out/minikube-linux-arm64 -p functional-449836 cache add registry.k8s.io/pause:3.1: (1.168291784s)
functional_test.go:1064: (dbg) Run:  out/minikube-linux-arm64 -p functional-449836 cache add registry.k8s.io/pause:3.3
functional_test.go:1064: (dbg) Done: out/minikube-linux-arm64 -p functional-449836 cache add registry.k8s.io/pause:3.3: (1.177993468s)
functional_test.go:1064: (dbg) Run:  out/minikube-linux-arm64 -p functional-449836 cache add registry.k8s.io/pause:latest
functional_test.go:1064: (dbg) Done: out/minikube-linux-arm64 -p functional-449836 cache add registry.k8s.io/pause:latest: (1.046389172s)
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/CacheCmd/cache/add_remote (3.39s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/CacheCmd/cache/add_local (1.08s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/CacheCmd/cache/add_local
functional_test.go:1092: (dbg) Run:  docker build -t minikube-local-cache-test:functional-449836 /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-beta.0serialCach3340575019/001
functional_test.go:1104: (dbg) Run:  out/minikube-linux-arm64 -p functional-449836 cache add minikube-local-cache-test:functional-449836
functional_test.go:1109: (dbg) Run:  out/minikube-linux-arm64 -p functional-449836 cache delete minikube-local-cache-test:functional-449836
functional_test.go:1098: (dbg) Run:  docker rmi minikube-local-cache-test:functional-449836
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/CacheCmd/cache/add_local (1.08s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/CacheCmd/cache/CacheDelete (0.06s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/CacheCmd/cache/CacheDelete
functional_test.go:1117: (dbg) Run:  out/minikube-linux-arm64 cache delete registry.k8s.io/pause:3.3
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/CacheCmd/cache/CacheDelete (0.06s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/CacheCmd/cache/list (0.05s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/CacheCmd/cache/list
functional_test.go:1125: (dbg) Run:  out/minikube-linux-arm64 cache list
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/CacheCmd/cache/list (0.05s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/CacheCmd/cache/verify_cache_inside_node (0.29s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/CacheCmd/cache/verify_cache_inside_node
functional_test.go:1139: (dbg) Run:  out/minikube-linux-arm64 -p functional-449836 ssh sudo crictl images
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/CacheCmd/cache/verify_cache_inside_node (0.29s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/CacheCmd/cache/cache_reload (1.83s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/CacheCmd/cache/cache_reload
functional_test.go:1162: (dbg) Run:  out/minikube-linux-arm64 -p functional-449836 ssh sudo crictl rmi registry.k8s.io/pause:latest
functional_test.go:1168: (dbg) Run:  out/minikube-linux-arm64 -p functional-449836 ssh sudo crictl inspecti registry.k8s.io/pause:latest
functional_test.go:1168: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-449836 ssh sudo crictl inspecti registry.k8s.io/pause:latest: exit status 1 (287.582416ms)

-- stdout --
	FATA[0000] no such image "registry.k8s.io/pause:latest" present 

-- /stdout --
** stderr ** 
	ssh: Process exited with status 1

** /stderr **
functional_test.go:1173: (dbg) Run:  out/minikube-linux-arm64 -p functional-449836 cache reload
functional_test.go:1178: (dbg) Run:  out/minikube-linux-arm64 -p functional-449836 ssh sudo crictl inspecti registry.k8s.io/pause:latest
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/CacheCmd/cache/cache_reload (1.83s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/CacheCmd/cache/delete (0.12s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/CacheCmd/cache/delete
functional_test.go:1187: (dbg) Run:  out/minikube-linux-arm64 cache delete registry.k8s.io/pause:3.1
functional_test.go:1187: (dbg) Run:  out/minikube-linux-arm64 cache delete registry.k8s.io/pause:latest
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/CacheCmd/cache/delete (0.12s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/LogsCmd (1.25s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/LogsCmd
functional_test.go:1251: (dbg) Run:  out/minikube-linux-arm64 -p functional-449836 logs
functional_test.go:1251: (dbg) Done: out/minikube-linux-arm64 -p functional-449836 logs: (1.250753421s)
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/LogsCmd (1.25s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/LogsFileCmd (1.03s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/LogsFileCmd
functional_test.go:1265: (dbg) Run:  out/minikube-linux-arm64 -p functional-449836 logs --file /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-beta.0serialLogs1283706137/001/logs.txt
functional_test.go:1265: (dbg) Done: out/minikube-linux-arm64 -p functional-449836 logs --file /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-beta.0serialLogs1283706137/001/logs.txt: (1.022826819s)
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/serial/LogsFileCmd (1.03s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ConfigCmd (0.41s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ConfigCmd
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ConfigCmd

=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ConfigCmd
functional_test.go:1214: (dbg) Run:  out/minikube-linux-arm64 -p functional-449836 config unset cpus
functional_test.go:1214: (dbg) Run:  out/minikube-linux-arm64 -p functional-449836 config get cpus
functional_test.go:1214: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-449836 config get cpus: exit status 14 (59.264241ms)

** stderr ** 
	Error: specified key could not be found in config

** /stderr **
functional_test.go:1214: (dbg) Run:  out/minikube-linux-arm64 -p functional-449836 config set cpus 2
functional_test.go:1214: (dbg) Run:  out/minikube-linux-arm64 -p functional-449836 config get cpus
functional_test.go:1214: (dbg) Run:  out/minikube-linux-arm64 -p functional-449836 config unset cpus
functional_test.go:1214: (dbg) Run:  out/minikube-linux-arm64 -p functional-449836 config get cpus
functional_test.go:1214: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-449836 config get cpus: exit status 14 (64.398651ms)

** stderr ** 
	Error: specified key could not be found in config

** /stderr **
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ConfigCmd (0.41s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/DryRun (0.47s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/DryRun
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/DryRun

=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/DryRun
functional_test.go:989: (dbg) Run:  out/minikube-linux-arm64 start -p functional-449836 --dry-run --memory 250MB --alsologtostderr --driver=docker  --container-runtime=containerd --kubernetes-version=v1.35.0-beta.0
functional_test.go:989: (dbg) Non-zero exit: out/minikube-linux-arm64 start -p functional-449836 --dry-run --memory 250MB --alsologtostderr --driver=docker  --container-runtime=containerd --kubernetes-version=v1.35.0-beta.0: exit status 23 (183.963936ms)

-- stdout --
	* [functional-449836] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	  - MINIKUBE_LOCATION=22021
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - KUBECONFIG=/home/jenkins/minikube-integration/22021-2487/kubeconfig
	  - MINIKUBE_HOME=/home/jenkins/minikube-integration/22021-2487/.minikube
	  - MINIKUBE_BIN=out/minikube-linux-arm64
	  - MINIKUBE_FORCE_SYSTEMD=
	* Using the docker driver based on existing profile
	
	

-- /stdout --
** stderr ** 
	I1202 19:27:54.154522   72030 out.go:360] Setting OutFile to fd 1 ...
	I1202 19:27:54.154648   72030 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1202 19:27:54.154683   72030 out.go:374] Setting ErrFile to fd 2...
	I1202 19:27:54.154696   72030 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1202 19:27:54.154950   72030 root.go:338] Updating PATH: /home/jenkins/minikube-integration/22021-2487/.minikube/bin
	I1202 19:27:54.155309   72030 out.go:368] Setting JSON to false
	I1202 19:27:54.156089   72030 start.go:133] hostinfo: {"hostname":"ip-172-31-31-251","uptime":4211,"bootTime":1764699464,"procs":157,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1084-aws","kernelArch":"aarch64","virtualizationSystem":"","virtualizationRole":"","hostId":"982e3628-3742-4b3e-bb63-ac1b07660ec7"}
	I1202 19:27:54.156164   72030 start.go:143] virtualization:  
	I1202 19:27:54.159381   72030 out.go:179] * [functional-449836] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	I1202 19:27:54.162416   72030 out.go:179]   - MINIKUBE_LOCATION=22021
	I1202 19:27:54.162486   72030 notify.go:221] Checking for updates...
	I1202 19:27:54.168732   72030 out.go:179]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I1202 19:27:54.171666   72030 out.go:179]   - KUBECONFIG=/home/jenkins/minikube-integration/22021-2487/kubeconfig
	I1202 19:27:54.174565   72030 out.go:179]   - MINIKUBE_HOME=/home/jenkins/minikube-integration/22021-2487/.minikube
	I1202 19:27:54.177453   72030 out.go:179]   - MINIKUBE_BIN=out/minikube-linux-arm64
	I1202 19:27:54.180131   72030 out.go:179]   - MINIKUBE_FORCE_SYSTEMD=
	I1202 19:27:54.183419   72030 config.go:182] Loaded profile config "functional-449836": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-beta.0
	I1202 19:27:54.184046   72030 driver.go:422] Setting default libvirt URI to qemu:///system
	I1202 19:27:54.216677   72030 docker.go:124] docker version: linux-28.1.1:Docker Engine - Community
	I1202 19:27:54.216797   72030 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1202 19:27:54.270378   72030 info.go:266] docker info: {ID:EOU5:DNGX:XN6V:L2FZ:UXRM:5TWK:EVUR:KC2F:GT7Z:Y4O4:GB77:5PD3 Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:52 SystemTime:2025-12-02 19:27:54.26076247 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aa
rch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214839296 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-31-251 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Pa
th:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1202 19:27:54.270491   72030 docker.go:319] overlay module found
	I1202 19:27:54.273520   72030 out.go:179] * Using the docker driver based on existing profile
	I1202 19:27:54.276440   72030 start.go:309] selected driver: docker
	I1202 19:27:54.276463   72030 start.go:927] validating driver "docker" against &{Name:functional-449836 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-449836 Namespace:default APIServerHAVIP: APIS
erverName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p Mou
ntUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1202 19:27:54.276568   72030 start.go:938] status for docker: {Installed:true Healthy:true Running:false NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I1202 19:27:54.280071   72030 out.go:203] 
	W1202 19:27:54.282913   72030 out.go:285] X Exiting due to RSRC_INSUFFICIENT_REQ_MEMORY: Requested memory allocation 250MiB is less than the usable minimum of 1800MB
	X Exiting due to RSRC_INSUFFICIENT_REQ_MEMORY: Requested memory allocation 250MiB is less than the usable minimum of 1800MB
	I1202 19:27:54.285850   72030 out.go:203] 

** /stderr **
functional_test.go:1006: (dbg) Run:  out/minikube-linux-arm64 start -p functional-449836 --dry-run --alsologtostderr -v=1 --driver=docker  --container-runtime=containerd --kubernetes-version=v1.35.0-beta.0
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/DryRun (0.47s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/InternationalLanguage (0.2s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/InternationalLanguage
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/InternationalLanguage

=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/InternationalLanguage
functional_test.go:1035: (dbg) Run:  out/minikube-linux-arm64 start -p functional-449836 --dry-run --memory 250MB --alsologtostderr --driver=docker  --container-runtime=containerd --kubernetes-version=v1.35.0-beta.0
functional_test.go:1035: (dbg) Non-zero exit: out/minikube-linux-arm64 start -p functional-449836 --dry-run --memory 250MB --alsologtostderr --driver=docker  --container-runtime=containerd --kubernetes-version=v1.35.0-beta.0: exit status 23 (203.23523ms)

-- stdout --
	* [functional-449836] minikube v1.37.0 sur Ubuntu 20.04 (arm64)
	  - MINIKUBE_LOCATION=22021
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - KUBECONFIG=/home/jenkins/minikube-integration/22021-2487/kubeconfig
	  - MINIKUBE_HOME=/home/jenkins/minikube-integration/22021-2487/.minikube
	  - MINIKUBE_BIN=out/minikube-linux-arm64
	  - MINIKUBE_FORCE_SYSTEMD=
	* Utilisation du pilote docker basé sur le profil existant
	
	

-- /stdout --
** stderr ** 
	I1202 19:27:53.957697   71984 out.go:360] Setting OutFile to fd 1 ...
	I1202 19:27:53.957863   71984 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1202 19:27:53.957875   71984 out.go:374] Setting ErrFile to fd 2...
	I1202 19:27:53.957882   71984 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1202 19:27:53.958282   71984 root.go:338] Updating PATH: /home/jenkins/minikube-integration/22021-2487/.minikube/bin
	I1202 19:27:53.958718   71984 out.go:368] Setting JSON to false
	I1202 19:27:53.959515   71984 start.go:133] hostinfo: {"hostname":"ip-172-31-31-251","uptime":4210,"bootTime":1764699464,"procs":157,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1084-aws","kernelArch":"aarch64","virtualizationSystem":"","virtualizationRole":"","hostId":"982e3628-3742-4b3e-bb63-ac1b07660ec7"}
	I1202 19:27:53.959587   71984 start.go:143] virtualization:  
	I1202 19:27:53.962892   71984 out.go:179] * [functional-449836] minikube v1.37.0 sur Ubuntu 20.04 (arm64)
	I1202 19:27:53.967274   71984 out.go:179]   - MINIKUBE_LOCATION=22021
	I1202 19:27:53.967368   71984 notify.go:221] Checking for updates...
	I1202 19:27:53.975558   71984 out.go:179]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I1202 19:27:53.978576   71984 out.go:179]   - KUBECONFIG=/home/jenkins/minikube-integration/22021-2487/kubeconfig
	I1202 19:27:53.981505   71984 out.go:179]   - MINIKUBE_HOME=/home/jenkins/minikube-integration/22021-2487/.minikube
	I1202 19:27:53.984471   71984 out.go:179]   - MINIKUBE_BIN=out/minikube-linux-arm64
	I1202 19:27:53.987486   71984 out.go:179]   - MINIKUBE_FORCE_SYSTEMD=
	I1202 19:27:53.990918   71984 config.go:182] Loaded profile config "functional-449836": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-beta.0
	I1202 19:27:53.991524   71984 driver.go:422] Setting default libvirt URI to qemu:///system
	I1202 19:27:54.021817   71984 docker.go:124] docker version: linux-28.1.1:Docker Engine - Community
	I1202 19:27:54.021931   71984 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1202 19:27:54.085863   71984 info.go:266] docker info: {ID:EOU5:DNGX:XN6V:L2FZ:UXRM:5TWK:EVUR:KC2F:GT7Z:Y4O4:GB77:5PD3 Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:36 OomKillDisable:true NGoroutines:52 SystemTime:2025-12-02 19:27:54.076494696 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214839296 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-31-251 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1202 19:27:54.085967   71984 docker.go:319] overlay module found
	I1202 19:27:54.089102   71984 out.go:179] * Using the docker driver based on the existing profile
	I1202 19:27:54.091821   71984 start.go:309] selected driver: docker
	I1202 19:27:54.091840   71984 start.go:927] validating driver "docker" against &{Name:functional-449836 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.48-1764169655-21974@sha256:5caa2df9c71885b15a10c4769bf4c9c00c1759c0d87b1a7e0b5b61285526245b Memory:4096 CPUs:2 DiskSize:20000 Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.35.0-beta.0 ClusterName:functional-449836 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.35.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s MountString: Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false DisableCoreDNSLog:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I1202 19:27:54.091947   71984 start.go:938] status for docker: {Installed:true Healthy:true Running:false NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I1202 19:27:54.095520   71984 out.go:203] 
	W1202 19:27:54.098482   71984 out.go:285] X Exiting due to RSRC_INSUFFICIENT_REQ_MEMORY: Requested memory allocation 250MiB is less than the usable minimum of 1800MB
	X Exiting due to RSRC_INSUFFICIENT_REQ_MEMORY: Requested memory allocation 250MiB is less than the usable minimum of 1800MB
	I1202 19:27:54.101407   71984 out.go:203] 

** /stderr **
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/InternationalLanguage (0.20s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/AddonsCmd (0.14s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/AddonsCmd
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/AddonsCmd

=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/AddonsCmd
functional_test.go:1695: (dbg) Run:  out/minikube-linux-arm64 -p functional-449836 addons list
functional_test.go:1707: (dbg) Run:  out/minikube-linux-arm64 -p functional-449836 addons list -o json
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/AddonsCmd (0.14s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/SSHCmd (0.71s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/SSHCmd
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/SSHCmd

=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/SSHCmd
functional_test.go:1730: (dbg) Run:  out/minikube-linux-arm64 -p functional-449836 ssh "echo hello"
functional_test.go:1747: (dbg) Run:  out/minikube-linux-arm64 -p functional-449836 ssh "cat /etc/hostname"
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/SSHCmd (0.71s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/CpCmd (2.22s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/CpCmd
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/CpCmd

=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/CpCmd
helpers_test.go:573: (dbg) Run:  out/minikube-linux-arm64 -p functional-449836 cp testdata/cp-test.txt /home/docker/cp-test.txt
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p functional-449836 ssh -n functional-449836 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:573: (dbg) Run:  out/minikube-linux-arm64 -p functional-449836 cp functional-449836:/home/docker/cp-test.txt /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-beta.0parallelCp3136267943/001/cp-test.txt
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p functional-449836 ssh -n functional-449836 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:573: (dbg) Run:  out/minikube-linux-arm64 -p functional-449836 cp testdata/cp-test.txt /tmp/does/not/exist/cp-test.txt
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p functional-449836 ssh -n functional-449836 "sudo cat /tmp/does/not/exist/cp-test.txt"
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/CpCmd (2.22s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/FileSync (0.28s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/FileSync
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/FileSync

=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/FileSync
functional_test.go:1934: Checking for existence of /etc/test/nested/copy/4435/hosts within VM
functional_test.go:1936: (dbg) Run:  out/minikube-linux-arm64 -p functional-449836 ssh "sudo cat /etc/test/nested/copy/4435/hosts"
functional_test.go:1941: file sync test content: Test file for checking file sync process
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/FileSync (0.28s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/CertSync (1.72s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/CertSync
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/CertSync

=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/CertSync
functional_test.go:1977: Checking for existence of /etc/ssl/certs/4435.pem within VM
functional_test.go:1978: (dbg) Run:  out/minikube-linux-arm64 -p functional-449836 ssh "sudo cat /etc/ssl/certs/4435.pem"
functional_test.go:1977: Checking for existence of /usr/share/ca-certificates/4435.pem within VM
functional_test.go:1978: (dbg) Run:  out/minikube-linux-arm64 -p functional-449836 ssh "sudo cat /usr/share/ca-certificates/4435.pem"
functional_test.go:1977: Checking for existence of /etc/ssl/certs/51391683.0 within VM
functional_test.go:1978: (dbg) Run:  out/minikube-linux-arm64 -p functional-449836 ssh "sudo cat /etc/ssl/certs/51391683.0"
functional_test.go:2004: Checking for existence of /etc/ssl/certs/44352.pem within VM
functional_test.go:2005: (dbg) Run:  out/minikube-linux-arm64 -p functional-449836 ssh "sudo cat /etc/ssl/certs/44352.pem"
functional_test.go:2004: Checking for existence of /usr/share/ca-certificates/44352.pem within VM
functional_test.go:2005: (dbg) Run:  out/minikube-linux-arm64 -p functional-449836 ssh "sudo cat /usr/share/ca-certificates/44352.pem"
functional_test.go:2004: Checking for existence of /etc/ssl/certs/3ec20f2e.0 within VM
functional_test.go:2005: (dbg) Run:  out/minikube-linux-arm64 -p functional-449836 ssh "sudo cat /etc/ssl/certs/3ec20f2e.0"
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/CertSync (1.72s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/NonActiveRuntimeDisabled (0.53s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/NonActiveRuntimeDisabled
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/NonActiveRuntimeDisabled

=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/NonActiveRuntimeDisabled
functional_test.go:2032: (dbg) Run:  out/minikube-linux-arm64 -p functional-449836 ssh "sudo systemctl is-active docker"
functional_test.go:2032: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-449836 ssh "sudo systemctl is-active docker": exit status 1 (271.850795ms)

-- stdout --
	inactive

-- /stdout --
** stderr ** 
	ssh: Process exited with status 3

** /stderr **
functional_test.go:2032: (dbg) Run:  out/minikube-linux-arm64 -p functional-449836 ssh "sudo systemctl is-active crio"
functional_test.go:2032: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-449836 ssh "sudo systemctl is-active crio": exit status 1 (260.240253ms)

-- stdout --
	inactive

-- /stdout --
** stderr ** 
	ssh: Process exited with status 3

** /stderr **
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/NonActiveRuntimeDisabled (0.53s)
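A note on the exit statuses captured above: `systemctl is-active <unit>` prints the unit state and exits 0 only when the unit is active; `inactive` exits 3 under systemd, which the minikube ssh wrapper then surfaces as its own `exit status 1`. A minimal sketch of how that result is judged, assuming those conventions (`runtime_is_disabled` is a hypothetical helper, not part of the test suite):

```python
# Hedged sketch: interpreting the NonActiveRuntimeDisabled output above.
# `systemctl is-active` exits 0 only for an active unit ("inactive" -> 3
# on systemd); the ssh wrapper reports a non-zero status of its own.

def runtime_is_disabled(stdout: str, remote_status: int) -> bool:
    """True when a non-active runtime check should count as a pass."""
    # docker and crio both print "inactive" with remote status 3 in the log.
    return stdout.strip() == "inactive" and remote_status != 0

print(runtime_is_disabled("inactive\n", 3))
print(runtime_is_disabled("active\n", 0))
```

This mirrors why the two `Non-zero exit` entries above are expected outcomes rather than failures on a containerd cluster.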

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/License (0.19s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/License
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/License

=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/License
functional_test.go:2293: (dbg) Run:  out/minikube-linux-arm64 license
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/License (0.19s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/TunnelCmd/serial/StartTunnel (0s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/TunnelCmd/serial/StartTunnel
functional_test_tunnel_test.go:129: (dbg) daemon: [out/minikube-linux-arm64 -p functional-449836 tunnel --alsologtostderr]
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/TunnelCmd/serial/StartTunnel (0.00s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/TunnelCmd/serial/DeleteTunnel (0.1s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/TunnelCmd/serial/DeleteTunnel
functional_test_tunnel_test.go:434: (dbg) stopping [out/minikube-linux-arm64 -p functional-449836 tunnel --alsologtostderr] ...
functional_test_tunnel_test.go:437: failed to stop process: exit status 103
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/TunnelCmd/serial/DeleteTunnel (0.10s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ProfileCmd/profile_not_create (0.39s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ProfileCmd/profile_not_create
functional_test.go:1285: (dbg) Run:  out/minikube-linux-arm64 profile lis
functional_test.go:1290: (dbg) Run:  out/minikube-linux-arm64 profile list --output json
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ProfileCmd/profile_not_create (0.39s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ProfileCmd/profile_list (0.4s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ProfileCmd/profile_list
functional_test.go:1325: (dbg) Run:  out/minikube-linux-arm64 profile list
functional_test.go:1330: Took "346.487424ms" to run "out/minikube-linux-arm64 profile list"
functional_test.go:1339: (dbg) Run:  out/minikube-linux-arm64 profile list -l
functional_test.go:1344: Took "51.671098ms" to run "out/minikube-linux-arm64 profile list -l"
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ProfileCmd/profile_list (0.40s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ProfileCmd/profile_json_output (0.38s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ProfileCmd/profile_json_output
functional_test.go:1376: (dbg) Run:  out/minikube-linux-arm64 profile list -o json
functional_test.go:1381: Took "331.302173ms" to run "out/minikube-linux-arm64 profile list -o json"
functional_test.go:1389: (dbg) Run:  out/minikube-linux-arm64 profile list -o json --light
functional_test.go:1394: Took "52.669817ms" to run "out/minikube-linux-arm64 profile list -o json --light"
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ProfileCmd/profile_json_output (0.38s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/MountCmd/specific-port (2.1s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/MountCmd/specific-port
functional_test_mount_test.go:213: (dbg) daemon: [out/minikube-linux-arm64 mount -p functional-449836 /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-beta.0parallelMo265742512/001:/mount-9p --alsologtostderr -v=1 --port 46464]
functional_test_mount_test.go:243: (dbg) Run:  out/minikube-linux-arm64 -p functional-449836 ssh "findmnt -T /mount-9p | grep 9p"
functional_test_mount_test.go:243: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-449836 ssh "findmnt -T /mount-9p | grep 9p": exit status 1 (365.454523ms)

** stderr ** 
	ssh: Process exited with status 1

** /stderr **
I1202 19:27:47.222524    4435 retry.go:31] will retry after 704.134606ms: exit status 1
functional_test_mount_test.go:243: (dbg) Run:  out/minikube-linux-arm64 -p functional-449836 ssh "findmnt -T /mount-9p | grep 9p"
functional_test_mount_test.go:257: (dbg) Run:  out/minikube-linux-arm64 -p functional-449836 ssh -- ls -la /mount-9p
functional_test_mount_test.go:261: guest mount directory contents
total 0
functional_test_mount_test.go:263: (dbg) stopping [out/minikube-linux-arm64 mount -p functional-449836 /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-beta.0parallelMo265742512/001:/mount-9p --alsologtostderr -v=1 --port 46464] ...
functional_test_mount_test.go:264: reading mount text
functional_test_mount_test.go:278: done reading mount text
functional_test_mount_test.go:230: (dbg) Run:  out/minikube-linux-arm64 -p functional-449836 ssh "sudo umount -f /mount-9p"
functional_test_mount_test.go:230: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-449836 ssh "sudo umount -f /mount-9p": exit status 1 (272.907799ms)

-- stdout --
	umount: /mount-9p: not mounted.

-- /stdout --
** stderr ** 
	ssh: Process exited with status 32

** /stderr **
functional_test_mount_test.go:232: "out/minikube-linux-arm64 -p functional-449836 ssh \"sudo umount -f /mount-9p\"": exit status 1
functional_test_mount_test.go:234: (dbg) stopping [out/minikube-linux-arm64 mount -p functional-449836 /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-beta.0parallelMo265742512/001:/mount-9p --alsologtostderr -v=1 --port 46464] ...
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/MountCmd/specific-port (2.10s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/MountCmd/VerifyCleanup (1.85s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/MountCmd/VerifyCleanup
functional_test_mount_test.go:298: (dbg) daemon: [out/minikube-linux-arm64 mount -p functional-449836 /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-beta.0parallelMo681912831/001:/mount1 --alsologtostderr -v=1]
functional_test_mount_test.go:298: (dbg) daemon: [out/minikube-linux-arm64 mount -p functional-449836 /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-beta.0parallelMo681912831/001:/mount2 --alsologtostderr -v=1]
functional_test_mount_test.go:298: (dbg) daemon: [out/minikube-linux-arm64 mount -p functional-449836 /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-beta.0parallelMo681912831/001:/mount3 --alsologtostderr -v=1]
functional_test_mount_test.go:325: (dbg) Run:  out/minikube-linux-arm64 -p functional-449836 ssh "findmnt -T" /mount1
functional_test_mount_test.go:325: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-449836 ssh "findmnt -T" /mount1: exit status 1 (607.99625ms)

** stderr ** 
	ssh: Process exited with status 1

** /stderr **
I1202 19:27:49.567487    4435 retry.go:31] will retry after 342.197489ms: exit status 1
functional_test_mount_test.go:325: (dbg) Run:  out/minikube-linux-arm64 -p functional-449836 ssh "findmnt -T" /mount1
functional_test_mount_test.go:325: (dbg) Run:  out/minikube-linux-arm64 -p functional-449836 ssh "findmnt -T" /mount2
functional_test_mount_test.go:325: (dbg) Run:  out/minikube-linux-arm64 -p functional-449836 ssh "findmnt -T" /mount3
functional_test_mount_test.go:370: (dbg) Run:  out/minikube-linux-arm64 mount -p functional-449836 --kill=true
functional_test_mount_test.go:313: (dbg) stopping [out/minikube-linux-arm64 mount -p functional-449836 /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-beta.0parallelMo681912831/001:/mount1 --alsologtostderr -v=1] ...
helpers_test.go:507: unable to find parent, assuming dead: process does not exist
functional_test_mount_test.go:313: (dbg) stopping [out/minikube-linux-arm64 mount -p functional-449836 /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-beta.0parallelMo681912831/001:/mount2 --alsologtostderr -v=1] ...
helpers_test.go:507: unable to find parent, assuming dead: process does not exist
functional_test_mount_test.go:313: (dbg) stopping [out/minikube-linux-arm64 mount -p functional-449836 /tmp/TestFunctionalNewestKubernetesVersionv1.35.0-beta.0parallelMo681912831/001:/mount3 --alsologtostderr -v=1] ...
helpers_test.go:507: unable to find parent, assuming dead: process does not exist
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/MountCmd/VerifyCleanup (1.85s)
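The `retry.go:31` lines in the mount tests above show minikube re-probing a failed `findmnt` after a short randomized delay. A minimal sketch of that retry-until-success pattern, assuming jittered exponential backoff (the `retry` helper and demo `probe` are illustrative, not minikube's actual implementation):

```python
import random
import time

def retry(probe, attempts=5, base_delay=0.3):
    """Call probe() until it succeeds, sleeping a jittered, growing
    delay between failures, like the 'will retry after ...' messages."""
    for i in range(attempts):
        try:
            return probe()
        except Exception:
            if i == attempts - 1:
                raise
            # e.g. roughly 0.15-0.45s after the first failure, doubling after
            time.sleep(base_delay * (2 ** i) * random.uniform(0.5, 1.5))

# Demo: a probe that fails twice (like the first findmnt calls), then mounts.
state = {"calls": 0}
def probe():
    state["calls"] += 1
    if state["calls"] < 3:
        raise RuntimeError("exit status 1")
    return "mounted"

print(retry(probe))
```

The randomized delay spreads repeated probes out so a slow mount daemon is not hammered at a fixed interval.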

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/Version/short (0.08s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/Version/short
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/Version/short

=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/Version/short
functional_test.go:2261: (dbg) Run:  out/minikube-linux-arm64 -p functional-449836 version --short
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/Version/short (0.08s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/Version/components (0.48s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/Version/components
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/Version/components

=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/Version/components
functional_test.go:2275: (dbg) Run:  out/minikube-linux-arm64 -p functional-449836 version -o=json --components
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/Version/components (0.48s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageListShort (0.23s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageListShort
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageListShort

=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageListShort
functional_test.go:276: (dbg) Run:  out/minikube-linux-arm64 -p functional-449836 image ls --format short --alsologtostderr
functional_test.go:281: (dbg) Stdout: out/minikube-linux-arm64 -p functional-449836 image ls --format short --alsologtostderr:
registry.k8s.io/pause:latest
registry.k8s.io/pause:3.3
registry.k8s.io/pause:3.10.1
registry.k8s.io/pause:3.1
registry.k8s.io/kube-scheduler:v1.35.0-beta.0
registry.k8s.io/kube-proxy:v1.35.0-beta.0
registry.k8s.io/kube-controller-manager:v1.35.0-beta.0
registry.k8s.io/kube-apiserver:v1.35.0-beta.0
registry.k8s.io/etcd:3.6.5-0
registry.k8s.io/coredns/coredns:v1.13.1
gcr.io/k8s-minikube/storage-provisioner:v5
docker.io/library/minikube-local-cache-test:functional-449836
docker.io/kicbase/echo-server:functional-449836
functional_test.go:284: (dbg) Stderr: out/minikube-linux-arm64 -p functional-449836 image ls --format short --alsologtostderr:
I1202 19:28:06.966369   74200 out.go:360] Setting OutFile to fd 1 ...
I1202 19:28:06.966826   74200 out.go:408] TERM=,COLORTERM=, which probably does not support color
I1202 19:28:06.966845   74200 out.go:374] Setting ErrFile to fd 2...
I1202 19:28:06.966852   74200 out.go:408] TERM=,COLORTERM=, which probably does not support color
I1202 19:28:06.967524   74200 root.go:338] Updating PATH: /home/jenkins/minikube-integration/22021-2487/.minikube/bin
I1202 19:28:06.971733   74200 config.go:182] Loaded profile config "functional-449836": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-beta.0
I1202 19:28:06.971876   74200 config.go:182] Loaded profile config "functional-449836": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-beta.0
I1202 19:28:06.972443   74200 cli_runner.go:164] Run: docker container inspect functional-449836 --format={{.State.Status}}
I1202 19:28:06.990089   74200 ssh_runner.go:195] Run: systemctl --version
I1202 19:28:06.990146   74200 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-449836
I1202 19:28:07.006884   74200 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:32788 SSHKeyPath:/home/jenkins/minikube-integration/22021-2487/.minikube/machines/functional-449836/id_rsa Username:docker}
I1202 19:28:07.110770   74200 ssh_runner.go:195] Run: sudo crictl images --output json
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageListShort (0.23s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageListTable (0.22s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageListTable
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageListTable

=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageListTable
functional_test.go:276: (dbg) Run:  out/minikube-linux-arm64 -p functional-449836 image ls --format table --alsologtostderr
functional_test.go:281: (dbg) Stdout: out/minikube-linux-arm64 -p functional-449836 image ls --format table --alsologtostderr:
┌─────────────────────────────────────────────┬───────────────────┬───────────────┬────────┐
│                    IMAGE                    │        TAG        │   IMAGE ID    │  SIZE  │
├─────────────────────────────────────────────┼───────────────────┼───────────────┼────────┤
│ docker.io/library/minikube-local-cache-test │ functional-449836 │ sha256:9d59d8 │ 992B   │
│ gcr.io/k8s-minikube/storage-provisioner     │ v5                │ sha256:667491 │ 8.03MB │
│ localhost/my-image                          │ functional-449836 │ sha256:a30433 │ 831kB  │
│ registry.k8s.io/kube-apiserver              │ v1.35.0-beta.0    │ sha256:ccd634 │ 24.7MB │
│ registry.k8s.io/kube-controller-manager     │ v1.35.0-beta.0    │ sha256:68b5f7 │ 20.7MB │
│ registry.k8s.io/kube-proxy                  │ v1.35.0-beta.0    │ sha256:404c2e │ 22.4MB │
│ registry.k8s.io/pause                       │ 3.10.1            │ sha256:d7b100 │ 265kB  │
│ registry.k8s.io/coredns/coredns             │ v1.13.1           │ sha256:e08f4d │ 21.2MB │
│ registry.k8s.io/etcd                        │ 3.6.5-0           │ sha256:2c5f0d │ 21.1MB │
│ registry.k8s.io/kube-scheduler              │ v1.35.0-beta.0    │ sha256:163787 │ 15.4MB │
│ registry.k8s.io/pause                       │ 3.1               │ sha256:8057e0 │ 262kB  │
│ registry.k8s.io/pause                       │ 3.3               │ sha256:3d1873 │ 249kB  │
│ registry.k8s.io/pause                       │ latest            │ sha256:8cb209 │ 71.3kB │
│ docker.io/kicbase/echo-server               │ functional-449836 │ sha256:ce2d2c │ 2.17MB │
└─────────────────────────────────────────────┴───────────────────┴───────────────┴────────┘
functional_test.go:284: (dbg) Stderr: out/minikube-linux-arm64 -p functional-449836 image ls --format table --alsologtostderr:
I1202 19:28:11.347193   74594 out.go:360] Setting OutFile to fd 1 ...
I1202 19:28:11.347379   74594 out.go:408] TERM=,COLORTERM=, which probably does not support color
I1202 19:28:11.347415   74594 out.go:374] Setting ErrFile to fd 2...
I1202 19:28:11.347429   74594 out.go:408] TERM=,COLORTERM=, which probably does not support color
I1202 19:28:11.347708   74594 root.go:338] Updating PATH: /home/jenkins/minikube-integration/22021-2487/.minikube/bin
I1202 19:28:11.348371   74594 config.go:182] Loaded profile config "functional-449836": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-beta.0
I1202 19:28:11.348536   74594 config.go:182] Loaded profile config "functional-449836": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-beta.0
I1202 19:28:11.349094   74594 cli_runner.go:164] Run: docker container inspect functional-449836 --format={{.State.Status}}
I1202 19:28:11.366255   74594 ssh_runner.go:195] Run: systemctl --version
I1202 19:28:11.366322   74594 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-449836
I1202 19:28:11.384035   74594 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:32788 SSHKeyPath:/home/jenkins/minikube-integration/22021-2487/.minikube/machines/functional-449836/id_rsa Username:docker}
I1202 19:28:11.487201   74594 ssh_runner.go:195] Run: sudo crictl images --output json
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageListTable (0.22s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageListJson (0.22s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageListJson
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageListJson

=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageListJson
functional_test.go:276: (dbg) Run:  out/minikube-linux-arm64 -p functional-449836 image ls --format json --alsologtostderr
functional_test.go:281: (dbg) Stdout: out/minikube-linux-arm64 -p functional-449836 image ls --format json --alsologtostderr:
[{"id":"sha256:d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd","repoDigests":[],"repoTags":["registry.k8s.io/pause:3.10.1"],"size":"265458"},{"id":"sha256:8cb2091f603e75187e2f6226c5901d12e00b1d1f778c6471ae4578e8a1c4724a","repoDigests":[],"repoTags":["registry.k8s.io/pause:latest"],"size":"71300"},{"id":"sha256:9d59d8178e3a4c209f1c923737212d2bc54133c85455f4f8e051d069a9d30853","repoDigests":[],"repoTags":["docker.io/library/minikube-local-cache-test:functional-449836"],"size":"992"},{"id":"sha256:e08f4d9d2e6ede8185064c13b41f8eeee95b609c0ca93b6fe7509fe527c907cf","repoDigests":[],"repoTags":["registry.k8s.io/coredns/coredns:v1.13.1"],"size":"21166088"},{"id":"sha256:2c5f0dedd21c25ec3a6709934d22152d53ec50fe57b72d29e4450655e3d14d42","repoDigests":[],"repoTags":["registry.k8s.io/etcd:3.6.5-0"],"size":"21134420"},{"id":"sha256:ccd634d9bcc36ac6235e9c86676cd3a02c06afc3788a25f1bbf39ca7d44585f4","repoDigests":[],"repoTags":["registry.k8s.io/kube-apiserver:v1.35.0-beta.0"],"size":"24676285"},{"id":"sha256:16378741539f1be9c6e347d127537d379a6592587b09b4eb47964cb5c43a409b","repoDigests":[],"repoTags":["registry.k8s.io/kube-scheduler:v1.35.0-beta.0"],"size":"15389290"},{"id":"sha256:3d18732f8686cc3c878055d99a05fa80289502fa496b36b6a0fe0f77206a7300","repoDigests":[],"repoTags":["registry.k8s.io/pause:3.3"],"size":"249461"},{"id":"sha256:ce2d2cda2d858fdaea84129deb86d18e5dbf1c548f230b79fdca74cc91729d17","repoDigests":[],"repoTags":["docker.io/kicbase/echo-server:functional-449836"],"size":"2173567"},{"id":"sha256:66749159455b3f08c8318fe0233122f54d0f5889f9c5fdfb73c3fd9d99895b51","repoDigests":[],"repoTags":["gcr.io/k8s-minikube/storage-provisioner:v5"],"size":"8032639"},{"id":"sha256:a30433d368d66ec77b37b7bf10ff31a359d75890cf1752556ca5a3a2233accc8","repoDigests":[],"repoTags":["localhost/my-image:functional-449836"],"size":"830617"},{"id":"sha256:68b5f775f18769fcb77bd8474c80bda2050163b6c66f4551f352b7381b8ca5be","repoDigests":[],"repoTags":["registry.k8s.io/kube-controller-manager:v1.35.0-beta.0"],"size":"20658969"},{"id":"sha256:404c2e12861777b763b8feaa316d36680fc68ad308a8d2f6e55f1bb981cdd904","repoDigests":[],"repoTags":["registry.k8s.io/kube-proxy:v1.35.0-beta.0"],"size":"22428165"},{"id":"sha256:8057e0500773a37cde2cff041eb13ebd68c748419a2fbfd1dfb5bf38696cc8e5","repoDigests":[],"repoTags":["registry.k8s.io/pause:3.1"],"size":"262191"}]
functional_test.go:284: (dbg) Stderr: out/minikube-linux-arm64 -p functional-449836 image ls --format json --alsologtostderr:
I1202 19:28:11.127646   74559 out.go:360] Setting OutFile to fd 1 ...
I1202 19:28:11.127769   74559 out.go:408] TERM=,COLORTERM=, which probably does not support color
I1202 19:28:11.127779   74559 out.go:374] Setting ErrFile to fd 2...
I1202 19:28:11.127785   74559 out.go:408] TERM=,COLORTERM=, which probably does not support color
I1202 19:28:11.128082   74559 root.go:338] Updating PATH: /home/jenkins/minikube-integration/22021-2487/.minikube/bin
I1202 19:28:11.128752   74559 config.go:182] Loaded profile config "functional-449836": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-beta.0
I1202 19:28:11.128877   74559 config.go:182] Loaded profile config "functional-449836": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-beta.0
I1202 19:28:11.129445   74559 cli_runner.go:164] Run: docker container inspect functional-449836 --format={{.State.Status}}
I1202 19:28:11.147552   74559 ssh_runner.go:195] Run: systemctl --version
I1202 19:28:11.147623   74559 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-449836
I1202 19:28:11.165354   74559 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:32788 SSHKeyPath:/home/jenkins/minikube-integration/22021-2487/.minikube/machines/functional-449836/id_rsa Username:docker}
I1202 19:28:11.267310   74559 ssh_runner.go:195] Run: sudo crictl images --output json
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageListJson (0.22s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageListYaml (0.25s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageListYaml
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageListYaml

=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageListYaml
functional_test.go:276: (dbg) Run:  out/minikube-linux-arm64 -p functional-449836 image ls --format yaml --alsologtostderr
functional_test.go:281: (dbg) Stdout: out/minikube-linux-arm64 -p functional-449836 image ls --format yaml --alsologtostderr:
- id: sha256:9d59d8178e3a4c209f1c923737212d2bc54133c85455f4f8e051d069a9d30853
repoDigests: []
repoTags:
- docker.io/library/minikube-local-cache-test:functional-449836
size: "992"
- id: sha256:66749159455b3f08c8318fe0233122f54d0f5889f9c5fdfb73c3fd9d99895b51
repoDigests: []
repoTags:
- gcr.io/k8s-minikube/storage-provisioner:v5
size: "8032639"
- id: sha256:e08f4d9d2e6ede8185064c13b41f8eeee95b609c0ca93b6fe7509fe527c907cf
repoDigests: []
repoTags:
- registry.k8s.io/coredns/coredns:v1.13.1
size: "21166088"
- id: sha256:ccd634d9bcc36ac6235e9c86676cd3a02c06afc3788a25f1bbf39ca7d44585f4
repoDigests: []
repoTags:
- registry.k8s.io/kube-apiserver:v1.35.0-beta.0
size: "24676285"
- id: sha256:68b5f775f18769fcb77bd8474c80bda2050163b6c66f4551f352b7381b8ca5be
repoDigests: []
repoTags:
- registry.k8s.io/kube-controller-manager:v1.35.0-beta.0
size: "20658969"
- id: sha256:16378741539f1be9c6e347d127537d379a6592587b09b4eb47964cb5c43a409b
repoDigests: []
repoTags:
- registry.k8s.io/kube-scheduler:v1.35.0-beta.0
size: "15389290"
- id: sha256:d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd
repoDigests: []
repoTags:
- registry.k8s.io/pause:3.10.1
size: "265458"
- id: sha256:3d18732f8686cc3c878055d99a05fa80289502fa496b36b6a0fe0f77206a7300
repoDigests: []
repoTags:
- registry.k8s.io/pause:3.3
size: "249461"
- id: sha256:ce2d2cda2d858fdaea84129deb86d18e5dbf1c548f230b79fdca74cc91729d17
repoDigests: []
repoTags:
- docker.io/kicbase/echo-server:functional-449836
size: "2173567"
- id: sha256:2c5f0dedd21c25ec3a6709934d22152d53ec50fe57b72d29e4450655e3d14d42
repoDigests: []
repoTags:
- registry.k8s.io/etcd:3.6.5-0
size: "21134420"
- id: sha256:404c2e12861777b763b8feaa316d36680fc68ad308a8d2f6e55f1bb981cdd904
repoDigests: []
repoTags:
- registry.k8s.io/kube-proxy:v1.35.0-beta.0
size: "22428165"
- id: sha256:8057e0500773a37cde2cff041eb13ebd68c748419a2fbfd1dfb5bf38696cc8e5
repoDigests: []
repoTags:
- registry.k8s.io/pause:3.1
size: "262191"
- id: sha256:8cb2091f603e75187e2f6226c5901d12e00b1d1f778c6471ae4578e8a1c4724a
repoDigests: []
repoTags:
- registry.k8s.io/pause:latest
size: "71300"

functional_test.go:284: (dbg) Stderr: out/minikube-linux-arm64 -p functional-449836 image ls --format yaml --alsologtostderr:
I1202 19:28:07.220483   74237 out.go:360] Setting OutFile to fd 1 ...
I1202 19:28:07.220586   74237 out.go:408] TERM=,COLORTERM=, which probably does not support color
I1202 19:28:07.220596   74237 out.go:374] Setting ErrFile to fd 2...
I1202 19:28:07.220601   74237 out.go:408] TERM=,COLORTERM=, which probably does not support color
I1202 19:28:07.220970   74237 root.go:338] Updating PATH: /home/jenkins/minikube-integration/22021-2487/.minikube/bin
I1202 19:28:07.221895   74237 config.go:182] Loaded profile config "functional-449836": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-beta.0
I1202 19:28:07.222035   74237 config.go:182] Loaded profile config "functional-449836": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-beta.0
I1202 19:28:07.222748   74237 cli_runner.go:164] Run: docker container inspect functional-449836 --format={{.State.Status}}
I1202 19:28:07.247084   74237 ssh_runner.go:195] Run: systemctl --version
I1202 19:28:07.247156   74237 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-449836
I1202 19:28:07.264959   74237 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:32788 SSHKeyPath:/home/jenkins/minikube-integration/22021-2487/.minikube/machines/functional-449836/id_rsa Username:docker}
I1202 19:28:07.366915   74237 ssh_runner.go:195] Run: sudo crictl images --output json
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageListYaml (0.25s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageBuild (3.68s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageBuild
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageBuild

=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageBuild
functional_test.go:323: (dbg) Run:  out/minikube-linux-arm64 -p functional-449836 ssh pgrep buildkitd
functional_test.go:323: (dbg) Non-zero exit: out/minikube-linux-arm64 -p functional-449836 ssh pgrep buildkitd: exit status 1 (270.902971ms)

** stderr ** 
	ssh: Process exited with status 1

** /stderr **
functional_test.go:330: (dbg) Run:  out/minikube-linux-arm64 -p functional-449836 image build -t localhost/my-image:functional-449836 testdata/build --alsologtostderr
functional_test.go:330: (dbg) Done: out/minikube-linux-arm64 -p functional-449836 image build -t localhost/my-image:functional-449836 testdata/build --alsologtostderr: (3.162668666s)
functional_test.go:338: (dbg) Stderr: out/minikube-linux-arm64 -p functional-449836 image build -t localhost/my-image:functional-449836 testdata/build --alsologtostderr:
I1202 19:28:07.721787   74344 out.go:360] Setting OutFile to fd 1 ...
I1202 19:28:07.721943   74344 out.go:408] TERM=,COLORTERM=, which probably does not support color
I1202 19:28:07.721957   74344 out.go:374] Setting ErrFile to fd 2...
I1202 19:28:07.721964   74344 out.go:408] TERM=,COLORTERM=, which probably does not support color
I1202 19:28:07.722238   74344 root.go:338] Updating PATH: /home/jenkins/minikube-integration/22021-2487/.minikube/bin
I1202 19:28:07.722859   74344 config.go:182] Loaded profile config "functional-449836": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-beta.0
I1202 19:28:07.723583   74344 config.go:182] Loaded profile config "functional-449836": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.35.0-beta.0
I1202 19:28:07.724172   74344 cli_runner.go:164] Run: docker container inspect functional-449836 --format={{.State.Status}}
I1202 19:28:07.742685   74344 ssh_runner.go:195] Run: systemctl --version
I1202 19:28:07.742741   74344 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-449836
I1202 19:28:07.762447   74344 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:32788 SSHKeyPath:/home/jenkins/minikube-integration/22021-2487/.minikube/machines/functional-449836/id_rsa Username:docker}
I1202 19:28:07.866860   74344 build_images.go:162] Building image from path: /tmp/build.17599393.tar
I1202 19:28:07.866977   74344 ssh_runner.go:195] Run: sudo mkdir -p /var/lib/minikube/build
I1202 19:28:07.875038   74344 ssh_runner.go:195] Run: stat -c "%s %y" /var/lib/minikube/build/build.17599393.tar
I1202 19:28:07.879006   74344 ssh_runner.go:352] existence check for /var/lib/minikube/build/build.17599393.tar: stat -c "%s %y" /var/lib/minikube/build/build.17599393.tar: Process exited with status 1
stdout:

stderr:
stat: cannot statx '/var/lib/minikube/build/build.17599393.tar': No such file or directory
I1202 19:28:07.879040   74344 ssh_runner.go:362] scp /tmp/build.17599393.tar --> /var/lib/minikube/build/build.17599393.tar (3072 bytes)
I1202 19:28:07.896930   74344 ssh_runner.go:195] Run: sudo mkdir -p /var/lib/minikube/build/build.17599393
I1202 19:28:07.906776   74344 ssh_runner.go:195] Run: sudo tar -C /var/lib/minikube/build/build.17599393 -xf /var/lib/minikube/build/build.17599393.tar
I1202 19:28:07.915110   74344 containerd.go:394] Building image: /var/lib/minikube/build/build.17599393
I1202 19:28:07.915183   74344 ssh_runner.go:195] Run: sudo buildctl build --frontend dockerfile.v0 --local context=/var/lib/minikube/build/build.17599393 --local dockerfile=/var/lib/minikube/build/build.17599393 --output type=image,name=localhost/my-image:functional-449836
#1 [internal] load build definition from Dockerfile
#1 transferring dockerfile: 97B done
#1 DONE 0.0s

#2 [internal] load metadata for gcr.io/k8s-minikube/busybox:latest
#2 DONE 1.7s

#3 [internal] load .dockerignore
#3 transferring context: 2B done
#3 DONE 0.0s

#4 [internal] load build context
#4 transferring context: 62B done
#4 DONE 0.0s

#5 [1/3] FROM gcr.io/k8s-minikube/busybox:latest@sha256:ca5ae90100d50772da31f3b5016209e25ad61972404e2ccd83d44f10dee7e79b
#5 resolve gcr.io/k8s-minikube/busybox:latest@sha256:ca5ae90100d50772da31f3b5016209e25ad61972404e2ccd83d44f10dee7e79b 0.0s done
#5 DONE 0.1s

#5 [1/3] FROM gcr.io/k8s-minikube/busybox:latest@sha256:ca5ae90100d50772da31f3b5016209e25ad61972404e2ccd83d44f10dee7e79b
#5 sha256:a01966dde7f8d5ba10b6d87e776c7c8fb5a5f6bfa678874bd28b33b1fc6dba34 0B / 828.50kB 0.2s
#5 sha256:a01966dde7f8d5ba10b6d87e776c7c8fb5a5f6bfa678874bd28b33b1fc6dba34 828.50kB / 828.50kB 0.4s done
#5 extracting sha256:a01966dde7f8d5ba10b6d87e776c7c8fb5a5f6bfa678874bd28b33b1fc6dba34 0.1s done
#5 DONE 0.5s

#6 [2/3] RUN true
#6 DONE 0.2s

#7 [3/3] ADD content.txt /
#7 DONE 0.0s

#8 exporting to image
#8 exporting layers 0.1s done
#8 exporting manifest sha256:9ec6b23df07bad1326ee7e0ca5d46eb1692c761d73943615223eea2d9d9b686d
#8 exporting manifest sha256:9ec6b23df07bad1326ee7e0ca5d46eb1692c761d73943615223eea2d9d9b686d 0.0s done
#8 exporting config sha256:a30433d368d66ec77b37b7bf10ff31a359d75890cf1752556ca5a3a2233accc8 0.0s done
#8 naming to localhost/my-image:functional-449836 done
#8 DONE 0.2s
I1202 19:28:10.810255   74344 ssh_runner.go:235] Completed: sudo buildctl build --frontend dockerfile.v0 --local context=/var/lib/minikube/build/build.17599393 --local dockerfile=/var/lib/minikube/build/build.17599393 --output type=image,name=localhost/my-image:functional-449836: (2.895041869s)
I1202 19:28:10.810331   74344 ssh_runner.go:195] Run: sudo rm -rf /var/lib/minikube/build/build.17599393
I1202 19:28:10.818500   74344 ssh_runner.go:195] Run: sudo rm -f /var/lib/minikube/build/build.17599393.tar
I1202 19:28:10.825954   74344 build_images.go:218] Built localhost/my-image:functional-449836 from /tmp/build.17599393.tar
I1202 19:28:10.825983   74344 build_images.go:134] succeeded building to: functional-449836
I1202 19:28:10.825989   74344 build_images.go:135] failed building to: 
functional_test.go:466: (dbg) Run:  out/minikube-linux-arm64 -p functional-449836 image ls
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageBuild (3.68s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/Setup (0.25s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/Setup
functional_test.go:357: (dbg) Run:  docker pull kicbase/echo-server:1.0
functional_test.go:362: (dbg) Run:  docker tag kicbase/echo-server:1.0 kicbase/echo-server:functional-449836
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/Setup (0.25s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageLoadDaemon (1.12s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageLoadDaemon
functional_test.go:370: (dbg) Run:  out/minikube-linux-arm64 -p functional-449836 image load --daemon kicbase/echo-server:functional-449836 --alsologtostderr
functional_test.go:466: (dbg) Run:  out/minikube-linux-arm64 -p functional-449836 image ls
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageLoadDaemon (1.12s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageReloadDaemon (1.09s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageReloadDaemon
functional_test.go:380: (dbg) Run:  out/minikube-linux-arm64 -p functional-449836 image load --daemon kicbase/echo-server:functional-449836 --alsologtostderr
functional_test.go:466: (dbg) Run:  out/minikube-linux-arm64 -p functional-449836 image ls
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageReloadDaemon (1.09s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageTagAndLoadDaemon (1.6s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageTagAndLoadDaemon
functional_test.go:250: (dbg) Run:  docker pull kicbase/echo-server:latest
functional_test.go:255: (dbg) Run:  docker tag kicbase/echo-server:latest kicbase/echo-server:functional-449836
functional_test.go:260: (dbg) Run:  out/minikube-linux-arm64 -p functional-449836 image load --daemon kicbase/echo-server:functional-449836 --alsologtostderr
functional_test.go:260: (dbg) Done: out/minikube-linux-arm64 -p functional-449836 image load --daemon kicbase/echo-server:functional-449836 --alsologtostderr: (1.111266319s)
functional_test.go:466: (dbg) Run:  out/minikube-linux-arm64 -p functional-449836 image ls
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageTagAndLoadDaemon (1.60s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageSaveToFile (0.34s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageSaveToFile
functional_test.go:395: (dbg) Run:  out/minikube-linux-arm64 -p functional-449836 image save kicbase/echo-server:functional-449836 /home/jenkins/workspace/Docker_Linux_containerd_arm64/echo-server-save.tar --alsologtostderr
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageSaveToFile (0.34s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageRemove (0.48s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageRemove
functional_test.go:407: (dbg) Run:  out/minikube-linux-arm64 -p functional-449836 image rm kicbase/echo-server:functional-449836 --alsologtostderr
functional_test.go:466: (dbg) Run:  out/minikube-linux-arm64 -p functional-449836 image ls
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageRemove (0.48s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageLoadFromFile (0.68s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageLoadFromFile
functional_test.go:424: (dbg) Run:  out/minikube-linux-arm64 -p functional-449836 image load /home/jenkins/workspace/Docker_Linux_containerd_arm64/echo-server-save.tar --alsologtostderr
functional_test.go:466: (dbg) Run:  out/minikube-linux-arm64 -p functional-449836 image ls
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageLoadFromFile (0.68s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageSaveDaemon (0.38s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageSaveDaemon
functional_test.go:434: (dbg) Run:  docker rmi kicbase/echo-server:functional-449836
functional_test.go:439: (dbg) Run:  out/minikube-linux-arm64 -p functional-449836 image save --daemon kicbase/echo-server:functional-449836 --alsologtostderr
functional_test.go:447: (dbg) Run:  docker image inspect kicbase/echo-server:functional-449836
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/ImageCommands/ImageSaveDaemon (0.38s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/UpdateContextCmd/no_changes (0.14s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/UpdateContextCmd/no_changes
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/UpdateContextCmd/no_changes

=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/UpdateContextCmd/no_changes
functional_test.go:2124: (dbg) Run:  out/minikube-linux-arm64 -p functional-449836 update-context --alsologtostderr -v=2
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/UpdateContextCmd/no_changes (0.14s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/UpdateContextCmd/no_minikube_cluster (0.15s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/UpdateContextCmd/no_minikube_cluster
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/UpdateContextCmd/no_minikube_cluster

=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/UpdateContextCmd/no_minikube_cluster
functional_test.go:2124: (dbg) Run:  out/minikube-linux-arm64 -p functional-449836 update-context --alsologtostderr -v=2
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/UpdateContextCmd/no_minikube_cluster (0.15s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/UpdateContextCmd/no_clusters (0.17s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/UpdateContextCmd/no_clusters
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/UpdateContextCmd/no_clusters

=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/UpdateContextCmd/no_clusters
functional_test.go:2124: (dbg) Run:  out/minikube-linux-arm64 -p functional-449836 update-context --alsologtostderr -v=2
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/UpdateContextCmd/no_clusters (0.17s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/delete_echo-server_images (0.05s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/delete_echo-server_images
functional_test.go:205: (dbg) Run:  docker rmi -f kicbase/echo-server:1.0
functional_test.go:205: (dbg) Run:  docker rmi -f kicbase/echo-server:functional-449836
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/delete_echo-server_images (0.05s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/delete_my-image_image (0.02s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/delete_my-image_image
functional_test.go:213: (dbg) Run:  docker rmi -f localhost/my-image:functional-449836
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/delete_my-image_image (0.02s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/delete_minikube_cached_images (0.01s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/delete_minikube_cached_images
functional_test.go:221: (dbg) Run:  docker rmi -f minikube-local-cache-test:functional-449836
--- PASS: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/delete_minikube_cached_images (0.01s)

TestMultiControlPlane/serial/StartCluster (197.71s)

=== RUN   TestMultiControlPlane/serial/StartCluster
ha_test.go:101: (dbg) Run:  out/minikube-linux-arm64 -p ha-394617 start --ha --memory 3072 --wait true --alsologtostderr -v 5 --driver=docker  --container-runtime=containerd
E1202 19:30:46.066462    4435 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/addons-932514/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1202 19:31:03.870121    4435 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1202 19:31:03.876421    4435 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1202 19:31:03.887759    4435 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1202 19:31:03.909097    4435 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1202 19:31:03.950403    4435 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1202 19:31:04.031759    4435 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1202 19:31:04.193204    4435 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1202 19:31:04.514852    4435 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1202 19:31:05.156896    4435 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1202 19:31:06.438210    4435 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1202 19:31:09.000446    4435 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1202 19:31:14.122380    4435 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1202 19:31:24.364042    4435 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1202 19:31:44.845964    4435 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1202 19:32:25.808495    4435 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1202 19:33:00.703418    4435 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-224594/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
ha_test.go:101: (dbg) Done: out/minikube-linux-arm64 -p ha-394617 start --ha --memory 3072 --wait true --alsologtostderr -v 5 --driver=docker  --container-runtime=containerd: (3m16.762032708s)
ha_test.go:107: (dbg) Run:  out/minikube-linux-arm64 -p ha-394617 status --alsologtostderr -v 5
--- PASS: TestMultiControlPlane/serial/StartCluster (197.71s)

TestMultiControlPlane/serial/DeployApp (7.6s)

=== RUN   TestMultiControlPlane/serial/DeployApp
ha_test.go:128: (dbg) Run:  out/minikube-linux-arm64 -p ha-394617 kubectl -- apply -f ./testdata/ha/ha-pod-dns-test.yaml
ha_test.go:133: (dbg) Run:  out/minikube-linux-arm64 -p ha-394617 kubectl -- rollout status deployment/busybox
ha_test.go:133: (dbg) Done: out/minikube-linux-arm64 -p ha-394617 kubectl -- rollout status deployment/busybox: (4.660944364s)
ha_test.go:140: (dbg) Run:  out/minikube-linux-arm64 -p ha-394617 kubectl -- get pods -o jsonpath='{.items[*].status.podIP}'
ha_test.go:163: (dbg) Run:  out/minikube-linux-arm64 -p ha-394617 kubectl -- get pods -o jsonpath='{.items[*].metadata.name}'
ha_test.go:171: (dbg) Run:  out/minikube-linux-arm64 -p ha-394617 kubectl -- exec busybox-7b57f96db7-7dpqq -- nslookup kubernetes.io
ha_test.go:171: (dbg) Run:  out/minikube-linux-arm64 -p ha-394617 kubectl -- exec busybox-7b57f96db7-hkfjm -- nslookup kubernetes.io
ha_test.go:171: (dbg) Run:  out/minikube-linux-arm64 -p ha-394617 kubectl -- exec busybox-7b57f96db7-t65fh -- nslookup kubernetes.io
ha_test.go:181: (dbg) Run:  out/minikube-linux-arm64 -p ha-394617 kubectl -- exec busybox-7b57f96db7-7dpqq -- nslookup kubernetes.default
ha_test.go:181: (dbg) Run:  out/minikube-linux-arm64 -p ha-394617 kubectl -- exec busybox-7b57f96db7-hkfjm -- nslookup kubernetes.default
ha_test.go:181: (dbg) Run:  out/minikube-linux-arm64 -p ha-394617 kubectl -- exec busybox-7b57f96db7-t65fh -- nslookup kubernetes.default
ha_test.go:189: (dbg) Run:  out/minikube-linux-arm64 -p ha-394617 kubectl -- exec busybox-7b57f96db7-7dpqq -- nslookup kubernetes.default.svc.cluster.local
ha_test.go:189: (dbg) Run:  out/minikube-linux-arm64 -p ha-394617 kubectl -- exec busybox-7b57f96db7-hkfjm -- nslookup kubernetes.default.svc.cluster.local
ha_test.go:189: (dbg) Run:  out/minikube-linux-arm64 -p ha-394617 kubectl -- exec busybox-7b57f96db7-t65fh -- nslookup kubernetes.default.svc.cluster.local
--- PASS: TestMultiControlPlane/serial/DeployApp (7.60s)

TestMultiControlPlane/serial/PingHostFromPods (1.68s)

=== RUN   TestMultiControlPlane/serial/PingHostFromPods
ha_test.go:199: (dbg) Run:  out/minikube-linux-arm64 -p ha-394617 kubectl -- get pods -o jsonpath='{.items[*].metadata.name}'
ha_test.go:207: (dbg) Run:  out/minikube-linux-arm64 -p ha-394617 kubectl -- exec busybox-7b57f96db7-7dpqq -- sh -c "nslookup host.minikube.internal | awk 'NR==5' | cut -d' ' -f3"
ha_test.go:218: (dbg) Run:  out/minikube-linux-arm64 -p ha-394617 kubectl -- exec busybox-7b57f96db7-7dpqq -- sh -c "ping -c 1 192.168.49.1"
ha_test.go:207: (dbg) Run:  out/minikube-linux-arm64 -p ha-394617 kubectl -- exec busybox-7b57f96db7-hkfjm -- sh -c "nslookup host.minikube.internal | awk 'NR==5' | cut -d' ' -f3"
ha_test.go:218: (dbg) Run:  out/minikube-linux-arm64 -p ha-394617 kubectl -- exec busybox-7b57f96db7-hkfjm -- sh -c "ping -c 1 192.168.49.1"
ha_test.go:207: (dbg) Run:  out/minikube-linux-arm64 -p ha-394617 kubectl -- exec busybox-7b57f96db7-t65fh -- sh -c "nslookup host.minikube.internal | awk 'NR==5' | cut -d' ' -f3"
ha_test.go:218: (dbg) Run:  out/minikube-linux-arm64 -p ha-394617 kubectl -- exec busybox-7b57f96db7-t65fh -- sh -c "ping -c 1 192.168.49.1"
--- PASS: TestMultiControlPlane/serial/PingHostFromPods (1.68s)
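The host-IP extraction above depends on the line layout of BusyBox-style nslookup output: line 5 holds the resolved address, and the third space-separated field is the IP. A minimal local sketch of the same `awk | cut` pipeline, run against captured sample output (the addresses here are illustrative, not from the cluster):

```shell
# Sample BusyBox-style nslookup output; the test pipeline takes line 5
# and the third space-separated field as the host IP.
sample='Server:    10.96.0.10
Address 1: 10.96.0.10 kube-dns.kube-system.svc.cluster.local

Name:      host.minikube.internal
Address 1: 192.168.49.1 host.minikube.internal'
ip=$(printf '%s\n' "$sample" | awk 'NR==5' | cut -d' ' -f3)
echo "$ip"   # → 192.168.49.1
```

The extracted IP is then handed to `ping -c 1` inside the pod, as in the test runs above.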

TestMultiControlPlane/serial/AddWorkerNode (60.11s)

=== RUN   TestMultiControlPlane/serial/AddWorkerNode
ha_test.go:228: (dbg) Run:  out/minikube-linux-arm64 -p ha-394617 node add --alsologtostderr -v 5
E1202 19:33:47.732638    4435 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
ha_test.go:228: (dbg) Done: out/minikube-linux-arm64 -p ha-394617 node add --alsologtostderr -v 5: (59.034874271s)
ha_test.go:234: (dbg) Run:  out/minikube-linux-arm64 -p ha-394617 status --alsologtostderr -v 5
ha_test.go:234: (dbg) Done: out/minikube-linux-arm64 -p ha-394617 status --alsologtostderr -v 5: (1.077867085s)
--- PASS: TestMultiControlPlane/serial/AddWorkerNode (60.11s)

TestMultiControlPlane/serial/NodeLabels (0.11s)

=== RUN   TestMultiControlPlane/serial/NodeLabels
ha_test.go:255: (dbg) Run:  kubectl --context ha-394617 get nodes -o "jsonpath=[{range .items[*]}{.metadata.labels},{end}]"
--- PASS: TestMultiControlPlane/serial/NodeLabels (0.11s)

TestMultiControlPlane/serial/HAppyAfterClusterStart (1.1s)

=== RUN   TestMultiControlPlane/serial/HAppyAfterClusterStart
ha_test.go:281: (dbg) Run:  out/minikube-linux-arm64 profile list --output json
ha_test.go:281: (dbg) Done: out/minikube-linux-arm64 profile list --output json: (1.104215726s)
--- PASS: TestMultiControlPlane/serial/HAppyAfterClusterStart (1.10s)

TestMultiControlPlane/serial/CopyFile (20.72s)

=== RUN   TestMultiControlPlane/serial/CopyFile
ha_test.go:328: (dbg) Run:  out/minikube-linux-arm64 -p ha-394617 status --output json --alsologtostderr -v 5
ha_test.go:328: (dbg) Done: out/minikube-linux-arm64 -p ha-394617 status --output json --alsologtostderr -v 5: (1.078556329s)
helpers_test.go:573: (dbg) Run:  out/minikube-linux-arm64 -p ha-394617 cp testdata/cp-test.txt ha-394617:/home/docker/cp-test.txt
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p ha-394617 ssh -n ha-394617 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:573: (dbg) Run:  out/minikube-linux-arm64 -p ha-394617 cp ha-394617:/home/docker/cp-test.txt /tmp/TestMultiControlPlaneserialCopyFile635794343/001/cp-test_ha-394617.txt
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p ha-394617 ssh -n ha-394617 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:573: (dbg) Run:  out/minikube-linux-arm64 -p ha-394617 cp ha-394617:/home/docker/cp-test.txt ha-394617-m02:/home/docker/cp-test_ha-394617_ha-394617-m02.txt
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p ha-394617 ssh -n ha-394617 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p ha-394617 ssh -n ha-394617-m02 "sudo cat /home/docker/cp-test_ha-394617_ha-394617-m02.txt"
helpers_test.go:573: (dbg) Run:  out/minikube-linux-arm64 -p ha-394617 cp ha-394617:/home/docker/cp-test.txt ha-394617-m03:/home/docker/cp-test_ha-394617_ha-394617-m03.txt
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p ha-394617 ssh -n ha-394617 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p ha-394617 ssh -n ha-394617-m03 "sudo cat /home/docker/cp-test_ha-394617_ha-394617-m03.txt"
helpers_test.go:573: (dbg) Run:  out/minikube-linux-arm64 -p ha-394617 cp ha-394617:/home/docker/cp-test.txt ha-394617-m04:/home/docker/cp-test_ha-394617_ha-394617-m04.txt
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p ha-394617 ssh -n ha-394617 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p ha-394617 ssh -n ha-394617-m04 "sudo cat /home/docker/cp-test_ha-394617_ha-394617-m04.txt"
helpers_test.go:573: (dbg) Run:  out/minikube-linux-arm64 -p ha-394617 cp testdata/cp-test.txt ha-394617-m02:/home/docker/cp-test.txt
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p ha-394617 ssh -n ha-394617-m02 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:573: (dbg) Run:  out/minikube-linux-arm64 -p ha-394617 cp ha-394617-m02:/home/docker/cp-test.txt /tmp/TestMultiControlPlaneserialCopyFile635794343/001/cp-test_ha-394617-m02.txt
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p ha-394617 ssh -n ha-394617-m02 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:573: (dbg) Run:  out/minikube-linux-arm64 -p ha-394617 cp ha-394617-m02:/home/docker/cp-test.txt ha-394617:/home/docker/cp-test_ha-394617-m02_ha-394617.txt
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p ha-394617 ssh -n ha-394617-m02 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p ha-394617 ssh -n ha-394617 "sudo cat /home/docker/cp-test_ha-394617-m02_ha-394617.txt"
helpers_test.go:573: (dbg) Run:  out/minikube-linux-arm64 -p ha-394617 cp ha-394617-m02:/home/docker/cp-test.txt ha-394617-m03:/home/docker/cp-test_ha-394617-m02_ha-394617-m03.txt
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p ha-394617 ssh -n ha-394617-m02 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p ha-394617 ssh -n ha-394617-m03 "sudo cat /home/docker/cp-test_ha-394617-m02_ha-394617-m03.txt"
helpers_test.go:573: (dbg) Run:  out/minikube-linux-arm64 -p ha-394617 cp ha-394617-m02:/home/docker/cp-test.txt ha-394617-m04:/home/docker/cp-test_ha-394617-m02_ha-394617-m04.txt
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p ha-394617 ssh -n ha-394617-m02 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p ha-394617 ssh -n ha-394617-m04 "sudo cat /home/docker/cp-test_ha-394617-m02_ha-394617-m04.txt"
helpers_test.go:573: (dbg) Run:  out/minikube-linux-arm64 -p ha-394617 cp testdata/cp-test.txt ha-394617-m03:/home/docker/cp-test.txt
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p ha-394617 ssh -n ha-394617-m03 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:573: (dbg) Run:  out/minikube-linux-arm64 -p ha-394617 cp ha-394617-m03:/home/docker/cp-test.txt /tmp/TestMultiControlPlaneserialCopyFile635794343/001/cp-test_ha-394617-m03.txt
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p ha-394617 ssh -n ha-394617-m03 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:573: (dbg) Run:  out/minikube-linux-arm64 -p ha-394617 cp ha-394617-m03:/home/docker/cp-test.txt ha-394617:/home/docker/cp-test_ha-394617-m03_ha-394617.txt
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p ha-394617 ssh -n ha-394617-m03 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p ha-394617 ssh -n ha-394617 "sudo cat /home/docker/cp-test_ha-394617-m03_ha-394617.txt"
helpers_test.go:573: (dbg) Run:  out/minikube-linux-arm64 -p ha-394617 cp ha-394617-m03:/home/docker/cp-test.txt ha-394617-m02:/home/docker/cp-test_ha-394617-m03_ha-394617-m02.txt
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p ha-394617 ssh -n ha-394617-m03 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p ha-394617 ssh -n ha-394617-m02 "sudo cat /home/docker/cp-test_ha-394617-m03_ha-394617-m02.txt"
helpers_test.go:573: (dbg) Run:  out/minikube-linux-arm64 -p ha-394617 cp ha-394617-m03:/home/docker/cp-test.txt ha-394617-m04:/home/docker/cp-test_ha-394617-m03_ha-394617-m04.txt
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p ha-394617 ssh -n ha-394617-m03 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p ha-394617 ssh -n ha-394617-m04 "sudo cat /home/docker/cp-test_ha-394617-m03_ha-394617-m04.txt"
helpers_test.go:573: (dbg) Run:  out/minikube-linux-arm64 -p ha-394617 cp testdata/cp-test.txt ha-394617-m04:/home/docker/cp-test.txt
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p ha-394617 ssh -n ha-394617-m04 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:573: (dbg) Run:  out/minikube-linux-arm64 -p ha-394617 cp ha-394617-m04:/home/docker/cp-test.txt /tmp/TestMultiControlPlaneserialCopyFile635794343/001/cp-test_ha-394617-m04.txt
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p ha-394617 ssh -n ha-394617-m04 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:573: (dbg) Run:  out/minikube-linux-arm64 -p ha-394617 cp ha-394617-m04:/home/docker/cp-test.txt ha-394617:/home/docker/cp-test_ha-394617-m04_ha-394617.txt
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p ha-394617 ssh -n ha-394617-m04 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p ha-394617 ssh -n ha-394617 "sudo cat /home/docker/cp-test_ha-394617-m04_ha-394617.txt"
helpers_test.go:573: (dbg) Run:  out/minikube-linux-arm64 -p ha-394617 cp ha-394617-m04:/home/docker/cp-test.txt ha-394617-m02:/home/docker/cp-test_ha-394617-m04_ha-394617-m02.txt
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p ha-394617 ssh -n ha-394617-m04 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p ha-394617 ssh -n ha-394617-m02 "sudo cat /home/docker/cp-test_ha-394617-m04_ha-394617-m02.txt"
helpers_test.go:573: (dbg) Run:  out/minikube-linux-arm64 -p ha-394617 cp ha-394617-m04:/home/docker/cp-test.txt ha-394617-m03:/home/docker/cp-test_ha-394617-m04_ha-394617-m03.txt
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p ha-394617 ssh -n ha-394617-m04 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p ha-394617 ssh -n ha-394617-m03 "sudo cat /home/docker/cp-test_ha-394617-m04_ha-394617-m03.txt"
--- PASS: TestMultiControlPlane/serial/CopyFile (20.72s)
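Every `cp` above is verified by ssh-ing into the target node and cat-ing the file back. The same round-trip copy-and-verify pattern, sketched locally against throwaway temp files rather than cluster nodes:

```shell
# Create a source file, copy it, and verify the copy matches byte-for-byte,
# mirroring the cp-then-cat verification in the test above.
tmp=$(mktemp -d)
printf 'Test file for minikube cp\n' > "$tmp/cp-test.txt"
cp "$tmp/cp-test.txt" "$tmp/cp-test-copy.txt"
result=$(cmp -s "$tmp/cp-test.txt" "$tmp/cp-test-copy.txt" && echo "contents match")
echo "$result"   # → contents match
rm -rf "$tmp"
```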

TestMultiControlPlane/serial/StopSecondaryNode (12.97s)

=== RUN   TestMultiControlPlane/serial/StopSecondaryNode
ha_test.go:365: (dbg) Run:  out/minikube-linux-arm64 -p ha-394617 node stop m02 --alsologtostderr -v 5
ha_test.go:365: (dbg) Done: out/minikube-linux-arm64 -p ha-394617 node stop m02 --alsologtostderr -v 5: (12.172478009s)
ha_test.go:371: (dbg) Run:  out/minikube-linux-arm64 -p ha-394617 status --alsologtostderr -v 5
ha_test.go:371: (dbg) Non-zero exit: out/minikube-linux-arm64 -p ha-394617 status --alsologtostderr -v 5: exit status 7 (795.469195ms)

-- stdout --
	ha-394617
	type: Control Plane
	host: Running
	kubelet: Running
	apiserver: Running
	kubeconfig: Configured
	
	ha-394617-m02
	type: Control Plane
	host: Stopped
	kubelet: Stopped
	apiserver: Stopped
	kubeconfig: Stopped
	
	ha-394617-m03
	type: Control Plane
	host: Running
	kubelet: Running
	apiserver: Running
	kubeconfig: Configured
	
	ha-394617-m04
	type: Worker
	host: Running
	kubelet: Running

-- /stdout --
** stderr ** 
	I1202 19:35:09.423316   92047 out.go:360] Setting OutFile to fd 1 ...
	I1202 19:35:09.423480   92047 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1202 19:35:09.423493   92047 out.go:374] Setting ErrFile to fd 2...
	I1202 19:35:09.423499   92047 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1202 19:35:09.423752   92047 root.go:338] Updating PATH: /home/jenkins/minikube-integration/22021-2487/.minikube/bin
	I1202 19:35:09.423945   92047 out.go:368] Setting JSON to false
	I1202 19:35:09.423978   92047 mustload.go:66] Loading cluster: ha-394617
	I1202 19:35:09.424034   92047 notify.go:221] Checking for updates...
	I1202 19:35:09.424440   92047 config.go:182] Loaded profile config "ha-394617": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.34.2
	I1202 19:35:09.424462   92047 status.go:174] checking status of ha-394617 ...
	I1202 19:35:09.425016   92047 cli_runner.go:164] Run: docker container inspect ha-394617 --format={{.State.Status}}
	I1202 19:35:09.445354   92047 status.go:371] ha-394617 host status = "Running" (err=<nil>)
	I1202 19:35:09.445380   92047 host.go:66] Checking if "ha-394617" exists ...
	I1202 19:35:09.445699   92047 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" ha-394617
	I1202 19:35:09.467751   92047 host.go:66] Checking if "ha-394617" exists ...
	I1202 19:35:09.468060   92047 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I1202 19:35:09.468117   92047 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" ha-394617
	I1202 19:35:09.486521   92047 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:32793 SSHKeyPath:/home/jenkins/minikube-integration/22021-2487/.minikube/machines/ha-394617/id_rsa Username:docker}
	I1202 19:35:09.593757   92047 ssh_runner.go:195] Run: systemctl --version
	I1202 19:35:09.600844   92047 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I1202 19:35:09.614744   92047 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1202 19:35:09.688455   92047 info.go:266] docker info: {ID:EOU5:DNGX:XN6V:L2FZ:UXRM:5TWK:EVUR:KC2F:GT7Z:Y4O4:GB77:5PD3 Containers:4 ContainersRunning:3 ContainersPaused:0 ContainersStopped:1 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:62 OomKillDisable:true NGoroutines:72 SystemTime:2025-12-02 19:35:09.678851759 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214839296 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-31-251 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1202 19:35:09.689003   92047 kubeconfig.go:125] found "ha-394617" server: "https://192.168.49.254:8443"
	I1202 19:35:09.689068   92047 api_server.go:166] Checking apiserver status ...
	I1202 19:35:09.689118   92047 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:35:09.702483   92047 ssh_runner.go:195] Run: sudo egrep ^[0-9]+:freezer: /proc/1318/cgroup
	I1202 19:35:09.711620   92047 api_server.go:182] apiserver freezer: "11:freezer:/docker/eb9608132db44a91ad77ff52dbbf3bde7a8af63ac4229808289a65861915d442/kubepods/burstable/podc6d3bd37cf310fcbfc3fc7e3279158f0/2df2e2f8ea93b340a936ff9fb6c114c0afd15b4e07cf996e02fd319d1eec8f89"
	I1202 19:35:09.711696   92047 ssh_runner.go:195] Run: sudo cat /sys/fs/cgroup/freezer/docker/eb9608132db44a91ad77ff52dbbf3bde7a8af63ac4229808289a65861915d442/kubepods/burstable/podc6d3bd37cf310fcbfc3fc7e3279158f0/2df2e2f8ea93b340a936ff9fb6c114c0afd15b4e07cf996e02fd319d1eec8f89/freezer.state
	I1202 19:35:09.720005   92047 api_server.go:204] freezer state: "THAWED"
	I1202 19:35:09.720040   92047 api_server.go:253] Checking apiserver healthz at https://192.168.49.254:8443/healthz ...
	I1202 19:35:09.728416   92047 api_server.go:279] https://192.168.49.254:8443/healthz returned 200:
	ok
	I1202 19:35:09.728443   92047 status.go:463] ha-394617 apiserver status = Running (err=<nil>)
	I1202 19:35:09.728460   92047 status.go:176] ha-394617 status: &{Name:ha-394617 Host:Running Kubelet:Running APIServer:Running Kubeconfig:Configured Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I1202 19:35:09.728477   92047 status.go:174] checking status of ha-394617-m02 ...
	I1202 19:35:09.728795   92047 cli_runner.go:164] Run: docker container inspect ha-394617-m02 --format={{.State.Status}}
	I1202 19:35:09.746973   92047 status.go:371] ha-394617-m02 host status = "Stopped" (err=<nil>)
	I1202 19:35:09.746993   92047 status.go:384] host is not running, skipping remaining checks
	I1202 19:35:09.746999   92047 status.go:176] ha-394617-m02 status: &{Name:ha-394617-m02 Host:Stopped Kubelet:Stopped APIServer:Stopped Kubeconfig:Stopped Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I1202 19:35:09.747020   92047 status.go:174] checking status of ha-394617-m03 ...
	I1202 19:35:09.747353   92047 cli_runner.go:164] Run: docker container inspect ha-394617-m03 --format={{.State.Status}}
	I1202 19:35:09.765462   92047 status.go:371] ha-394617-m03 host status = "Running" (err=<nil>)
	I1202 19:35:09.765488   92047 host.go:66] Checking if "ha-394617-m03" exists ...
	I1202 19:35:09.765804   92047 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" ha-394617-m03
	I1202 19:35:09.784707   92047 host.go:66] Checking if "ha-394617-m03" exists ...
	I1202 19:35:09.785063   92047 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I1202 19:35:09.785110   92047 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" ha-394617-m03
	I1202 19:35:09.809857   92047 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:32803 SSHKeyPath:/home/jenkins/minikube-integration/22021-2487/.minikube/machines/ha-394617-m03/id_rsa Username:docker}
	I1202 19:35:09.914110   92047 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I1202 19:35:09.930910   92047 kubeconfig.go:125] found "ha-394617" server: "https://192.168.49.254:8443"
	I1202 19:35:09.930937   92047 api_server.go:166] Checking apiserver status ...
	I1202 19:35:09.930979   92047 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:35:09.943332   92047 ssh_runner.go:195] Run: sudo egrep ^[0-9]+:freezer: /proc/1364/cgroup
	I1202 19:35:09.953425   92047 api_server.go:182] apiserver freezer: "11:freezer:/docker/5681572337aa1cafb7b22c2239a3ec423ad386408e3704c5f774a954c4a8d6d9/kubepods/burstable/pod1286fe61c71e78122f13981348489361/288f9139d9ff2e81fb1f45cf7885dab50b2823627b1df56f0d957eaffa856148"
	I1202 19:35:09.953507   92047 ssh_runner.go:195] Run: sudo cat /sys/fs/cgroup/freezer/docker/5681572337aa1cafb7b22c2239a3ec423ad386408e3704c5f774a954c4a8d6d9/kubepods/burstable/pod1286fe61c71e78122f13981348489361/288f9139d9ff2e81fb1f45cf7885dab50b2823627b1df56f0d957eaffa856148/freezer.state
	I1202 19:35:09.962909   92047 api_server.go:204] freezer state: "THAWED"
	I1202 19:35:09.962935   92047 api_server.go:253] Checking apiserver healthz at https://192.168.49.254:8443/healthz ...
	I1202 19:35:09.971274   92047 api_server.go:279] https://192.168.49.254:8443/healthz returned 200:
	ok
	I1202 19:35:09.971303   92047 status.go:463] ha-394617-m03 apiserver status = Running (err=<nil>)
	I1202 19:35:09.971313   92047 status.go:176] ha-394617-m03 status: &{Name:ha-394617-m03 Host:Running Kubelet:Running APIServer:Running Kubeconfig:Configured Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I1202 19:35:09.971329   92047 status.go:174] checking status of ha-394617-m04 ...
	I1202 19:35:09.971653   92047 cli_runner.go:164] Run: docker container inspect ha-394617-m04 --format={{.State.Status}}
	I1202 19:35:09.993105   92047 status.go:371] ha-394617-m04 host status = "Running" (err=<nil>)
	I1202 19:35:09.993136   92047 host.go:66] Checking if "ha-394617-m04" exists ...
	I1202 19:35:09.993631   92047 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" ha-394617-m04
	I1202 19:35:10.017961   92047 host.go:66] Checking if "ha-394617-m04" exists ...
	I1202 19:35:10.018313   92047 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I1202 19:35:10.018363   92047 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" ha-394617-m04
	I1202 19:35:10.042253   92047 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:32808 SSHKeyPath:/home/jenkins/minikube-integration/22021-2487/.minikube/machines/ha-394617-m04/id_rsa Username:docker}
	I1202 19:35:10.146249   92047 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I1202 19:35:10.161340   92047 status.go:176] ha-394617-m04 status: &{Name:ha-394617-m04 Host:Running Kubelet:Running APIServer:Irrelevant Kubeconfig:Irrelevant Worker:true TimeToStop: DockerEnv: PodManEnv:}

** /stderr **
--- PASS: TestMultiControlPlane/serial/StopSecondaryNode (12.97s)
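The apiserver liveness check in the stderr log above greps the process's freezer line out of `/proc/<pid>/cgroup` and then reads that cgroup's `freezer.state` (expecting "THAWED" on a healthy node). The field extraction can be sketched offline against a captured line; the cgroup path below is the one from the log:

```shell
# A cgroup v1 line as captured in the log; the hierarchy path is everything
# after the second colon of the "<id>:freezer:<path>" record.
line='11:freezer:/docker/eb9608132db44a91ad77ff52dbbf3bde7a8af63ac4229808289a65861915d442/kubepods/burstable/podc6d3bd37cf310fcbfc3fc7e3279158f0/2df2e2f8ea93b340a936ff9fb6c114c0afd15b4e07cf996e02fd319d1eec8f89'
path=$(printf '%s' "$line" | cut -d: -f3-)
# On a live node the probe then reads /sys/fs/cgroup/freezer${path}/freezer.state.
echo "$path"
```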

TestMultiControlPlane/serial/DegradedAfterControlPlaneNodeStop (0.86s)

=== RUN   TestMultiControlPlane/serial/DegradedAfterControlPlaneNodeStop
ha_test.go:392: (dbg) Run:  out/minikube-linux-arm64 profile list --output json
--- PASS: TestMultiControlPlane/serial/DegradedAfterControlPlaneNodeStop (0.86s)

TestMultiControlPlane/serial/RestartSecondaryNode (14.67s)

=== RUN   TestMultiControlPlane/serial/RestartSecondaryNode
ha_test.go:422: (dbg) Run:  out/minikube-linux-arm64 -p ha-394617 node start m02 --alsologtostderr -v 5
ha_test.go:422: (dbg) Done: out/minikube-linux-arm64 -p ha-394617 node start m02 --alsologtostderr -v 5: (13.016155811s)
ha_test.go:430: (dbg) Run:  out/minikube-linux-arm64 -p ha-394617 status --alsologtostderr -v 5
ha_test.go:430: (dbg) Done: out/minikube-linux-arm64 -p ha-394617 status --alsologtostderr -v 5: (1.509053507s)
ha_test.go:450: (dbg) Run:  kubectl get nodes
--- PASS: TestMultiControlPlane/serial/RestartSecondaryNode (14.67s)

TestMultiControlPlane/serial/HAppyAfterSecondaryNodeRestart (1.25s)

=== RUN   TestMultiControlPlane/serial/HAppyAfterSecondaryNodeRestart
ha_test.go:281: (dbg) Run:  out/minikube-linux-arm64 profile list --output json
ha_test.go:281: (dbg) Done: out/minikube-linux-arm64 profile list --output json: (1.251080619s)
--- PASS: TestMultiControlPlane/serial/HAppyAfterSecondaryNodeRestart (1.25s)

TestMultiControlPlane/serial/RestartClusterKeepsNodes (98.97s)

=== RUN   TestMultiControlPlane/serial/RestartClusterKeepsNodes
ha_test.go:458: (dbg) Run:  out/minikube-linux-arm64 -p ha-394617 node list --alsologtostderr -v 5
ha_test.go:464: (dbg) Run:  out/minikube-linux-arm64 -p ha-394617 stop --alsologtostderr -v 5
E1202 19:35:46.065608    4435 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/addons-932514/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1202 19:36:03.774013    4435 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-224594/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1202 19:36:03.868719    4435 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
ha_test.go:464: (dbg) Done: out/minikube-linux-arm64 -p ha-394617 stop --alsologtostderr -v 5: (37.657439282s)
ha_test.go:469: (dbg) Run:  out/minikube-linux-arm64 -p ha-394617 start --wait true --alsologtostderr -v 5
E1202 19:36:31.575340    4435 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
ha_test.go:469: (dbg) Done: out/minikube-linux-arm64 -p ha-394617 start --wait true --alsologtostderr -v 5: (1m1.149628631s)
ha_test.go:474: (dbg) Run:  out/minikube-linux-arm64 -p ha-394617 node list --alsologtostderr -v 5
--- PASS: TestMultiControlPlane/serial/RestartClusterKeepsNodes (98.97s)

TestMultiControlPlane/serial/DeleteSecondaryNode (11.28s)

=== RUN   TestMultiControlPlane/serial/DeleteSecondaryNode
ha_test.go:489: (dbg) Run:  out/minikube-linux-arm64 -p ha-394617 node delete m03 --alsologtostderr -v 5
ha_test.go:489: (dbg) Done: out/minikube-linux-arm64 -p ha-394617 node delete m03 --alsologtostderr -v 5: (10.278660476s)
ha_test.go:495: (dbg) Run:  out/minikube-linux-arm64 -p ha-394617 status --alsologtostderr -v 5
ha_test.go:513: (dbg) Run:  kubectl get nodes
ha_test.go:521: (dbg) Run:  kubectl get nodes -o "go-template='{{range .items}}{{range .status.conditions}}{{if eq .type "Ready"}} {{.status}}{{"\n"}}{{end}}{{end}}{{end}}'"
--- PASS: TestMultiControlPlane/serial/DeleteSecondaryNode (11.28s)
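Editor's note: the go-template passed to `kubectl get nodes` at ha_test.go:521 above prints the `Ready` condition status for every node. As a rough illustration only (not part of the test suite), the same extraction can be written in Python against `kubectl get nodes -o json` output; the node list below is a hypothetical stand-in for real cluster output.

```python
# Rough Python equivalent of the go-template at ha_test.go:521: walk every
# node's status.conditions and collect the status of the "Ready" condition.
# The `sample` dict is a hypothetical stand-in for `kubectl get nodes -o json`.

def ready_statuses(node_list):
    """Return the Ready condition status string for each node, in order."""
    out = []
    for item in node_list["items"]:
        for cond in item["status"]["conditions"]:
            if cond["type"] == "Ready":
                out.append(cond["status"])
    return out

sample = {
    "items": [
        {"status": {"conditions": [
            {"type": "MemoryPressure", "status": "False"},
            {"type": "Ready", "status": "True"},
        ]}},
        {"status": {"conditions": [
            {"type": "Ready", "status": "True"},
        ]}},
    ]
}

print(ready_statuses(sample))  # -> ['True', 'True']
```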

TestMultiControlPlane/serial/DegradedAfterSecondaryNodeDelete (0.78s)

=== RUN   TestMultiControlPlane/serial/DegradedAfterSecondaryNodeDelete
ha_test.go:392: (dbg) Run:  out/minikube-linux-arm64 profile list --output json
--- PASS: TestMultiControlPlane/serial/DegradedAfterSecondaryNodeDelete (0.78s)

TestMultiControlPlane/serial/StopCluster (36.31s)

=== RUN   TestMultiControlPlane/serial/StopCluster
ha_test.go:533: (dbg) Run:  out/minikube-linux-arm64 -p ha-394617 stop --alsologtostderr -v 5
ha_test.go:533: (dbg) Done: out/minikube-linux-arm64 -p ha-394617 stop --alsologtostderr -v 5: (36.198970078s)
ha_test.go:539: (dbg) Run:  out/minikube-linux-arm64 -p ha-394617 status --alsologtostderr -v 5
ha_test.go:539: (dbg) Non-zero exit: out/minikube-linux-arm64 -p ha-394617 status --alsologtostderr -v 5: exit status 7 (115.667871ms)

-- stdout --
	ha-394617
	type: Control Plane
	host: Stopped
	kubelet: Stopped
	apiserver: Stopped
	kubeconfig: Stopped
	
	ha-394617-m02
	type: Control Plane
	host: Stopped
	kubelet: Stopped
	apiserver: Stopped
	kubeconfig: Stopped
	
	ha-394617-m04
	type: Worker
	host: Stopped
	kubelet: Stopped
	

-- /stdout --
** stderr ** 
	I1202 19:37:54.227039  106895 out.go:360] Setting OutFile to fd 1 ...
	I1202 19:37:54.227255  106895 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1202 19:37:54.227282  106895 out.go:374] Setting ErrFile to fd 2...
	I1202 19:37:54.227301  106895 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1202 19:37:54.227593  106895 root.go:338] Updating PATH: /home/jenkins/minikube-integration/22021-2487/.minikube/bin
	I1202 19:37:54.227831  106895 out.go:368] Setting JSON to false
	I1202 19:37:54.227893  106895 mustload.go:66] Loading cluster: ha-394617
	I1202 19:37:54.227981  106895 notify.go:221] Checking for updates...
	I1202 19:37:54.228387  106895 config.go:182] Loaded profile config "ha-394617": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.34.2
	I1202 19:37:54.228429  106895 status.go:174] checking status of ha-394617 ...
	I1202 19:37:54.229305  106895 cli_runner.go:164] Run: docker container inspect ha-394617 --format={{.State.Status}}
	I1202 19:37:54.247480  106895 status.go:371] ha-394617 host status = "Stopped" (err=<nil>)
	I1202 19:37:54.247504  106895 status.go:384] host is not running, skipping remaining checks
	I1202 19:37:54.247512  106895 status.go:176] ha-394617 status: &{Name:ha-394617 Host:Stopped Kubelet:Stopped APIServer:Stopped Kubeconfig:Stopped Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I1202 19:37:54.247541  106895 status.go:174] checking status of ha-394617-m02 ...
	I1202 19:37:54.247863  106895 cli_runner.go:164] Run: docker container inspect ha-394617-m02 --format={{.State.Status}}
	I1202 19:37:54.265134  106895 status.go:371] ha-394617-m02 host status = "Stopped" (err=<nil>)
	I1202 19:37:54.265166  106895 status.go:384] host is not running, skipping remaining checks
	I1202 19:37:54.265175  106895 status.go:176] ha-394617-m02 status: &{Name:ha-394617-m02 Host:Stopped Kubelet:Stopped APIServer:Stopped Kubeconfig:Stopped Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I1202 19:37:54.265197  106895 status.go:174] checking status of ha-394617-m04 ...
	I1202 19:37:54.265504  106895 cli_runner.go:164] Run: docker container inspect ha-394617-m04 --format={{.State.Status}}
	I1202 19:37:54.294163  106895 status.go:371] ha-394617-m04 host status = "Stopped" (err=<nil>)
	I1202 19:37:54.294216  106895 status.go:384] host is not running, skipping remaining checks
	I1202 19:37:54.294225  106895 status.go:176] ha-394617-m04 status: &{Name:ha-394617-m04 Host:Stopped Kubelet:Stopped APIServer:Stopped Kubeconfig:Stopped Worker:true TimeToStop: DockerEnv: PodManEnv:}

** /stderr **
--- PASS: TestMultiControlPlane/serial/StopCluster (36.31s)

TestMultiControlPlane/serial/RestartCluster (60.5s)

=== RUN   TestMultiControlPlane/serial/RestartCluster
ha_test.go:562: (dbg) Run:  out/minikube-linux-arm64 -p ha-394617 start --wait true --alsologtostderr -v 5 --driver=docker  --container-runtime=containerd
E1202 19:38:00.704101    4435 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-224594/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
ha_test.go:562: (dbg) Done: out/minikube-linux-arm64 -p ha-394617 start --wait true --alsologtostderr -v 5 --driver=docker  --container-runtime=containerd: (59.499761766s)
ha_test.go:568: (dbg) Run:  out/minikube-linux-arm64 -p ha-394617 status --alsologtostderr -v 5
ha_test.go:586: (dbg) Run:  kubectl get nodes
ha_test.go:594: (dbg) Run:  kubectl get nodes -o "go-template='{{range .items}}{{range .status.conditions}}{{if eq .type "Ready"}} {{.status}}{{"\n"}}{{end}}{{end}}{{end}}'"
--- PASS: TestMultiControlPlane/serial/RestartCluster (60.50s)

TestMultiControlPlane/serial/DegradedAfterClusterRestart (0.83s)

=== RUN   TestMultiControlPlane/serial/DegradedAfterClusterRestart
ha_test.go:392: (dbg) Run:  out/minikube-linux-arm64 profile list --output json
--- PASS: TestMultiControlPlane/serial/DegradedAfterClusterRestart (0.83s)

TestMultiControlPlane/serial/AddSecondaryNode (90.35s)

=== RUN   TestMultiControlPlane/serial/AddSecondaryNode
ha_test.go:607: (dbg) Run:  out/minikube-linux-arm64 -p ha-394617 node add --control-plane --alsologtostderr -v 5
ha_test.go:607: (dbg) Done: out/minikube-linux-arm64 -p ha-394617 node add --control-plane --alsologtostderr -v 5: (1m29.179028875s)
ha_test.go:613: (dbg) Run:  out/minikube-linux-arm64 -p ha-394617 status --alsologtostderr -v 5
ha_test.go:613: (dbg) Done: out/minikube-linux-arm64 -p ha-394617 status --alsologtostderr -v 5: (1.165960995s)
--- PASS: TestMultiControlPlane/serial/AddSecondaryNode (90.35s)

TestMultiControlPlane/serial/HAppyAfterSecondaryNodeAdd (1.12s)

=== RUN   TestMultiControlPlane/serial/HAppyAfterSecondaryNodeAdd
ha_test.go:281: (dbg) Run:  out/minikube-linux-arm64 profile list --output json
ha_test.go:281: (dbg) Done: out/minikube-linux-arm64 profile list --output json: (1.123692982s)
--- PASS: TestMultiControlPlane/serial/HAppyAfterSecondaryNodeAdd (1.12s)

TestJSONOutput/start/Command (81.52s)

=== RUN   TestJSONOutput/start/Command
json_output_test.go:63: (dbg) Run:  out/minikube-linux-arm64 start -p json-output-592099 --output=json --user=testUser --memory=3072 --wait=true --driver=docker  --container-runtime=containerd
E1202 19:40:46.066105    4435 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/addons-932514/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1202 19:41:03.868439    4435 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
json_output_test.go:63: (dbg) Done: out/minikube-linux-arm64 start -p json-output-592099 --output=json --user=testUser --memory=3072 --wait=true --driver=docker  --container-runtime=containerd: (1m21.514200914s)
--- PASS: TestJSONOutput/start/Command (81.52s)

TestJSONOutput/start/Audit (0s)

=== RUN   TestJSONOutput/start/Audit
--- PASS: TestJSONOutput/start/Audit (0.00s)

TestJSONOutput/start/parallel/DistinctCurrentSteps (0s)

=== RUN   TestJSONOutput/start/parallel/DistinctCurrentSteps
=== PAUSE TestJSONOutput/start/parallel/DistinctCurrentSteps

=== CONT  TestJSONOutput/start/parallel/DistinctCurrentSteps
--- PASS: TestJSONOutput/start/parallel/DistinctCurrentSteps (0.00s)

TestJSONOutput/start/parallel/IncreasingCurrentSteps (0s)

=== RUN   TestJSONOutput/start/parallel/IncreasingCurrentSteps
=== PAUSE TestJSONOutput/start/parallel/IncreasingCurrentSteps

=== CONT  TestJSONOutput/start/parallel/IncreasingCurrentSteps
--- PASS: TestJSONOutput/start/parallel/IncreasingCurrentSteps (0.00s)

TestJSONOutput/pause/Command (0.74s)

=== RUN   TestJSONOutput/pause/Command
json_output_test.go:63: (dbg) Run:  out/minikube-linux-arm64 pause -p json-output-592099 --output=json --user=testUser
--- PASS: TestJSONOutput/pause/Command (0.74s)

TestJSONOutput/pause/Audit (0s)

=== RUN   TestJSONOutput/pause/Audit
--- PASS: TestJSONOutput/pause/Audit (0.00s)

TestJSONOutput/pause/parallel/DistinctCurrentSteps (0s)

=== RUN   TestJSONOutput/pause/parallel/DistinctCurrentSteps
=== PAUSE TestJSONOutput/pause/parallel/DistinctCurrentSteps

=== CONT  TestJSONOutput/pause/parallel/DistinctCurrentSteps
--- PASS: TestJSONOutput/pause/parallel/DistinctCurrentSteps (0.00s)

TestJSONOutput/pause/parallel/IncreasingCurrentSteps (0s)

=== RUN   TestJSONOutput/pause/parallel/IncreasingCurrentSteps
=== PAUSE TestJSONOutput/pause/parallel/IncreasingCurrentSteps

=== CONT  TestJSONOutput/pause/parallel/IncreasingCurrentSteps
--- PASS: TestJSONOutput/pause/parallel/IncreasingCurrentSteps (0.00s)

TestJSONOutput/unpause/Command (0.64s)

=== RUN   TestJSONOutput/unpause/Command
json_output_test.go:63: (dbg) Run:  out/minikube-linux-arm64 unpause -p json-output-592099 --output=json --user=testUser
--- PASS: TestJSONOutput/unpause/Command (0.64s)

TestJSONOutput/unpause/Audit (0s)

=== RUN   TestJSONOutput/unpause/Audit
--- PASS: TestJSONOutput/unpause/Audit (0.00s)

TestJSONOutput/unpause/parallel/DistinctCurrentSteps (0s)

=== RUN   TestJSONOutput/unpause/parallel/DistinctCurrentSteps
=== PAUSE TestJSONOutput/unpause/parallel/DistinctCurrentSteps

=== CONT  TestJSONOutput/unpause/parallel/DistinctCurrentSteps
--- PASS: TestJSONOutput/unpause/parallel/DistinctCurrentSteps (0.00s)

TestJSONOutput/unpause/parallel/IncreasingCurrentSteps (0s)

=== RUN   TestJSONOutput/unpause/parallel/IncreasingCurrentSteps
=== PAUSE TestJSONOutput/unpause/parallel/IncreasingCurrentSteps

=== CONT  TestJSONOutput/unpause/parallel/IncreasingCurrentSteps
--- PASS: TestJSONOutput/unpause/parallel/IncreasingCurrentSteps (0.00s)

TestJSONOutput/stop/Command (6.01s)

=== RUN   TestJSONOutput/stop/Command
json_output_test.go:63: (dbg) Run:  out/minikube-linux-arm64 stop -p json-output-592099 --output=json --user=testUser
json_output_test.go:63: (dbg) Done: out/minikube-linux-arm64 stop -p json-output-592099 --output=json --user=testUser: (6.005878637s)
--- PASS: TestJSONOutput/stop/Command (6.01s)

TestJSONOutput/stop/Audit (0s)

=== RUN   TestJSONOutput/stop/Audit
--- PASS: TestJSONOutput/stop/Audit (0.00s)

TestJSONOutput/stop/parallel/DistinctCurrentSteps (0s)

=== RUN   TestJSONOutput/stop/parallel/DistinctCurrentSteps
=== PAUSE TestJSONOutput/stop/parallel/DistinctCurrentSteps

=== CONT  TestJSONOutput/stop/parallel/DistinctCurrentSteps
--- PASS: TestJSONOutput/stop/parallel/DistinctCurrentSteps (0.00s)

TestJSONOutput/stop/parallel/IncreasingCurrentSteps (0s)

=== RUN   TestJSONOutput/stop/parallel/IncreasingCurrentSteps
=== PAUSE TestJSONOutput/stop/parallel/IncreasingCurrentSteps

=== CONT  TestJSONOutput/stop/parallel/IncreasingCurrentSteps
--- PASS: TestJSONOutput/stop/parallel/IncreasingCurrentSteps (0.00s)

TestErrorJSONOutput (0.25s)

=== RUN   TestErrorJSONOutput
json_output_test.go:160: (dbg) Run:  out/minikube-linux-arm64 start -p json-output-error-678285 --memory=3072 --output=json --wait=true --driver=fail
json_output_test.go:160: (dbg) Non-zero exit: out/minikube-linux-arm64 start -p json-output-error-678285 --memory=3072 --output=json --wait=true --driver=fail: exit status 56 (95.36694ms)

-- stdout --
	{"specversion":"1.0","id":"19db9dfe-d1be-4193-91b2-a579a53c042a","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.step","datacontenttype":"application/json","data":{"currentstep":"0","message":"[json-output-error-678285] minikube v1.37.0 on Ubuntu 20.04 (arm64)","name":"Initial Minikube Setup","totalsteps":"19"}}
	{"specversion":"1.0","id":"44c1b97d-0d2b-4429-b89d-6f7ed01339ee","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.info","datacontenttype":"application/json","data":{"message":"MINIKUBE_LOCATION=22021"}}
	{"specversion":"1.0","id":"d6468bce-38e7-43b8-ba32-1e7ebeba62d2","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.info","datacontenttype":"application/json","data":{"message":"MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true"}}
	{"specversion":"1.0","id":"42d4b82d-bad5-4f20-9d1c-fe5a6bacfa91","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.info","datacontenttype":"application/json","data":{"message":"KUBECONFIG=/home/jenkins/minikube-integration/22021-2487/kubeconfig"}}
	{"specversion":"1.0","id":"0e236694-d4ce-48e3-bb48-4c563aca2944","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.info","datacontenttype":"application/json","data":{"message":"MINIKUBE_HOME=/home/jenkins/minikube-integration/22021-2487/.minikube"}}
	{"specversion":"1.0","id":"ebeee4c3-6f95-4035-8e72-2d23a0df4128","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.info","datacontenttype":"application/json","data":{"message":"MINIKUBE_BIN=out/minikube-linux-arm64"}}
	{"specversion":"1.0","id":"12007511-a587-4bf9-99d4-43be551c6734","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.info","datacontenttype":"application/json","data":{"message":"MINIKUBE_FORCE_SYSTEMD="}}
	{"specversion":"1.0","id":"bcafce1b-0698-4dbf-89e8-ad72809dabb3","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.error","datacontenttype":"application/json","data":{"advice":"","exitcode":"56","issues":"","message":"The driver 'fail' is not supported on linux/arm64","name":"DRV_UNSUPPORTED_OS","url":""}}

-- /stdout --
helpers_test.go:175: Cleaning up "json-output-error-678285" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-linux-arm64 delete -p json-output-error-678285
--- PASS: TestErrorJSONOutput (0.25s)

TestKicCustomNetwork/create_custom_network (39.88s)

=== RUN   TestKicCustomNetwork/create_custom_network
kic_custom_network_test.go:57: (dbg) Run:  out/minikube-linux-arm64 start -p docker-network-354622 --network=
kic_custom_network_test.go:57: (dbg) Done: out/minikube-linux-arm64 start -p docker-network-354622 --network=: (37.549886242s)
kic_custom_network_test.go:150: (dbg) Run:  docker network ls --format {{.Name}}
helpers_test.go:175: Cleaning up "docker-network-354622" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-linux-arm64 delete -p docker-network-354622
helpers_test.go:178: (dbg) Done: out/minikube-linux-arm64 delete -p docker-network-354622: (2.304708583s)
--- PASS: TestKicCustomNetwork/create_custom_network (39.88s)

TestKicCustomNetwork/use_default_bridge_network (35.19s)

=== RUN   TestKicCustomNetwork/use_default_bridge_network
kic_custom_network_test.go:57: (dbg) Run:  out/minikube-linux-arm64 start -p docker-network-154673 --network=bridge
E1202 19:43:00.704567    4435 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-224594/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
kic_custom_network_test.go:57: (dbg) Done: out/minikube-linux-arm64 start -p docker-network-154673 --network=bridge: (32.969659131s)
kic_custom_network_test.go:150: (dbg) Run:  docker network ls --format {{.Name}}
helpers_test.go:175: Cleaning up "docker-network-154673" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-linux-arm64 delete -p docker-network-154673
helpers_test.go:178: (dbg) Done: out/minikube-linux-arm64 delete -p docker-network-154673: (2.184366644s)
--- PASS: TestKicCustomNetwork/use_default_bridge_network (35.19s)

TestKicExistingNetwork (35.66s)

=== RUN   TestKicExistingNetwork
I1202 19:43:26.152938    4435 cli_runner.go:164] Run: docker network inspect existing-network --format "{"Name": "{{.Name}}","Driver": "{{.Driver}}","Subnet": "{{range .IPAM.Config}}{{.Subnet}}{{end}}","Gateway": "{{range .IPAM.Config}}{{.Gateway}}{{end}}","MTU": {{if (index .Options "com.docker.network.driver.mtu")}}{{(index .Options "com.docker.network.driver.mtu")}}{{else}}0{{end}}, "ContainerIPs": [{{range $k,$v := .Containers }}"{{$v.IPv4Address}}",{{end}}]}"
W1202 19:43:26.170697    4435 cli_runner.go:211] docker network inspect existing-network --format "{"Name": "{{.Name}}","Driver": "{{.Driver}}","Subnet": "{{range .IPAM.Config}}{{.Subnet}}{{end}}","Gateway": "{{range .IPAM.Config}}{{.Gateway}}{{end}}","MTU": {{if (index .Options "com.docker.network.driver.mtu")}}{{(index .Options "com.docker.network.driver.mtu")}}{{else}}0{{end}}, "ContainerIPs": [{{range $k,$v := .Containers }}"{{$v.IPv4Address}}",{{end}}]}" returned with exit code 1
I1202 19:43:26.170772    4435 network_create.go:284] running [docker network inspect existing-network] to gather additional debugging logs...
I1202 19:43:26.170793    4435 cli_runner.go:164] Run: docker network inspect existing-network
W1202 19:43:26.186710    4435 cli_runner.go:211] docker network inspect existing-network returned with exit code 1
I1202 19:43:26.186738    4435 network_create.go:287] error running [docker network inspect existing-network]: docker network inspect existing-network: exit status 1
stdout:
[]

stderr:
Error response from daemon: network existing-network not found
I1202 19:43:26.186753    4435 network_create.go:289] output of [docker network inspect existing-network]: -- stdout --
[]

-- /stdout --
** stderr ** 
Error response from daemon: network existing-network not found

** /stderr **
I1202 19:43:26.186872    4435 cli_runner.go:164] Run: docker network inspect bridge --format "{"Name": "{{.Name}}","Driver": "{{.Driver}}","Subnet": "{{range .IPAM.Config}}{{.Subnet}}{{end}}","Gateway": "{{range .IPAM.Config}}{{.Gateway}}{{end}}","MTU": {{if (index .Options "com.docker.network.driver.mtu")}}{{(index .Options "com.docker.network.driver.mtu")}}{{else}}0{{end}}, "ContainerIPs": [{{range $k,$v := .Containers }}"{{$v.IPv4Address}}",{{end}}]}"
I1202 19:43:26.203586    4435 network.go:211] skipping subnet 192.168.49.0/24 that is taken: &{IP:192.168.49.0 Netmask:255.255.255.0 Prefix:24 CIDR:192.168.49.0/24 Gateway:192.168.49.1 ClientMin:192.168.49.2 ClientMax:192.168.49.254 Broadcast:192.168.49.255 IsPrivate:true Interface:{IfaceName:br-af5a3a112c8c IfaceIPv4:192.168.49.1 IfaceMTU:1500 IfaceMAC:da:13:76:29:c4:21} reservation:<nil>}
I1202 19:43:26.203856    4435 network.go:206] using free private subnet 192.168.58.0/24: &{IP:192.168.58.0 Netmask:255.255.255.0 Prefix:24 CIDR:192.168.58.0/24 Gateway:192.168.58.1 ClientMin:192.168.58.2 ClientMax:192.168.58.254 Broadcast:192.168.58.255 IsPrivate:true Interface:{IfaceName: IfaceIPv4: IfaceMTU:0 IfaceMAC:} reservation:0x4001bf43a0}
I1202 19:43:26.203880    4435 network_create.go:124] attempt to create docker network existing-network 192.168.58.0/24 with gateway 192.168.58.1 and MTU of 1500 ...
I1202 19:43:26.203930    4435 cli_runner.go:164] Run: docker network create --driver=bridge --subnet=192.168.58.0/24 --gateway=192.168.58.1 -o --ip-masq -o --icc -o com.docker.network.driver.mtu=1500 --label=created_by.minikube.sigs.k8s.io=true --label=name.minikube.sigs.k8s.io=existing-network existing-network
I1202 19:43:26.259514    4435 network_create.go:108] docker network existing-network 192.168.58.0/24 created
kic_custom_network_test.go:150: (dbg) Run:  docker network ls --format {{.Name}}
kic_custom_network_test.go:93: (dbg) Run:  out/minikube-linux-arm64 start -p existing-network-544884 --network=existing-network
kic_custom_network_test.go:93: (dbg) Done: out/minikube-linux-arm64 start -p existing-network-544884 --network=existing-network: (33.354297025s)
helpers_test.go:175: Cleaning up "existing-network-544884" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-linux-arm64 delete -p existing-network-544884
helpers_test.go:178: (dbg) Done: out/minikube-linux-arm64 delete -p existing-network-544884: (2.16146851s)
I1202 19:44:01.792107    4435 cli_runner.go:164] Run: docker network ls --filter=label=existing-network --format {{.Name}}
--- PASS: TestKicExistingNetwork (35.66s)
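Editor's note: the network_create log above records minikube skipping the taken subnet 192.168.49.0/24 and choosing the free subnet 192.168.58.0/24. A minimal sketch of that first-free-/24 scan, assuming a fixed step of 9 in the third octet (which matches the 49 -> 58 jump in the log); this is hypothetical illustration, not minikube's actual network package code:

```python
# Hypothetical sketch of the subnet selection seen in the log: try
# 192.168.49.0/24, 192.168.58.0/24, ... and return the first candidate
# that is not already taken by an existing docker network.
import ipaddress

def pick_free_subnet(taken_subnets, start_third_octet=49, step=9, tries=10):
    """Return the first 192.168.X.0/24 candidate not in taken_subnets."""
    taken = {ipaddress.ip_network(s) for s in taken_subnets}
    for i in range(tries):
        cand = ipaddress.ip_network(f"192.168.{start_third_octet + i * step}.0/24")
        if cand not in taken:
            return str(cand)
    return None

# 192.168.49.0/24 is taken (the default kic network), so the scan lands on
# 192.168.58.0/24, matching the subnet created for existing-network above.
print(pick_free_subnet(["192.168.49.0/24"]))  # -> 192.168.58.0/24
```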

TestKicCustomSubnet (34.13s)

=== RUN   TestKicCustomSubnet
kic_custom_network_test.go:112: (dbg) Run:  out/minikube-linux-arm64 start -p custom-subnet-865888 --subnet=192.168.60.0/24
kic_custom_network_test.go:112: (dbg) Done: out/minikube-linux-arm64 start -p custom-subnet-865888 --subnet=192.168.60.0/24: (31.822566342s)
kic_custom_network_test.go:161: (dbg) Run:  docker network inspect custom-subnet-865888 --format "{{(index .IPAM.Config 0).Subnet}}"
helpers_test.go:175: Cleaning up "custom-subnet-865888" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-linux-arm64 delete -p custom-subnet-865888
helpers_test.go:178: (dbg) Done: out/minikube-linux-arm64 delete -p custom-subnet-865888: (2.285944323s)
--- PASS: TestKicCustomSubnet (34.13s)

TestKicStaticIP (35.99s)

=== RUN   TestKicStaticIP
kic_custom_network_test.go:132: (dbg) Run:  out/minikube-linux-arm64 start -p static-ip-808818 --static-ip=192.168.200.200
kic_custom_network_test.go:132: (dbg) Done: out/minikube-linux-arm64 start -p static-ip-808818 --static-ip=192.168.200.200: (33.545486564s)
kic_custom_network_test.go:138: (dbg) Run:  out/minikube-linux-arm64 -p static-ip-808818 ip
helpers_test.go:175: Cleaning up "static-ip-808818" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-linux-arm64 delete -p static-ip-808818
helpers_test.go:178: (dbg) Done: out/minikube-linux-arm64 delete -p static-ip-808818: (2.299661295s)
--- PASS: TestKicStaticIP (35.99s)

TestMainNoArgs (0.05s)

=== RUN   TestMainNoArgs
main_test.go:70: (dbg) Run:  out/minikube-linux-arm64
--- PASS: TestMainNoArgs (0.05s)

=== RUN   TestMinikubeProfile
minikube_profile_test.go:44: (dbg) Run:  out/minikube-linux-arm64 start -p first-598870 --driver=docker  --container-runtime=containerd
E1202 19:45:29.136482    4435 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/addons-932514/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
minikube_profile_test.go:44: (dbg) Done: out/minikube-linux-arm64 start -p first-598870 --driver=docker  --container-runtime=containerd: (31.965275227s)
minikube_profile_test.go:44: (dbg) Run:  out/minikube-linux-arm64 start -p second-601414 --driver=docker  --container-runtime=containerd
E1202 19:45:46.065882    4435 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/addons-932514/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1202 19:46:03.868686    4435 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
minikube_profile_test.go:44: (dbg) Done: out/minikube-linux-arm64 start -p second-601414 --driver=docker  --container-runtime=containerd: (33.889421261s)
minikube_profile_test.go:51: (dbg) Run:  out/minikube-linux-arm64 profile first-598870
minikube_profile_test.go:55: (dbg) Run:  out/minikube-linux-arm64 profile list -ojson
minikube_profile_test.go:51: (dbg) Run:  out/minikube-linux-arm64 profile second-601414
minikube_profile_test.go:55: (dbg) Run:  out/minikube-linux-arm64 profile list -ojson
helpers_test.go:175: Cleaning up "second-601414" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-linux-arm64 delete -p second-601414
helpers_test.go:178: (dbg) Done: out/minikube-linux-arm64 delete -p second-601414: (2.100812985s)
helpers_test.go:175: Cleaning up "first-598870" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-linux-arm64 delete -p first-598870
helpers_test.go:178: (dbg) Done: out/minikube-linux-arm64 delete -p first-598870: (2.391969367s)
--- PASS: TestMinikubeProfile (71.88s)

=== RUN   TestMountStart/serial/StartWithMountFirst
mount_start_test.go:118: (dbg) Run:  out/minikube-linux-arm64 start -p mount-start-1-636693 --memory=3072 --mount-string /tmp/TestMountStartserial1348041936/001:/minikube-host --mount-gid 0 --mount-msize 6543 --mount-port 46464 --mount-uid 0 --no-kubernetes --driver=docker  --container-runtime=containerd
mount_start_test.go:118: (dbg) Done: out/minikube-linux-arm64 start -p mount-start-1-636693 --memory=3072 --mount-string /tmp/TestMountStartserial1348041936/001:/minikube-host --mount-gid 0 --mount-msize 6543 --mount-port 46464 --mount-uid 0 --no-kubernetes --driver=docker  --container-runtime=containerd: (7.16291995s)
--- PASS: TestMountStart/serial/StartWithMountFirst (8.16s)

=== RUN   TestMountStart/serial/VerifyMountFirst
mount_start_test.go:134: (dbg) Run:  out/minikube-linux-arm64 -p mount-start-1-636693 ssh -- ls /minikube-host
--- PASS: TestMountStart/serial/VerifyMountFirst (0.28s)

=== RUN   TestMountStart/serial/StartWithMountSecond
mount_start_test.go:118: (dbg) Run:  out/minikube-linux-arm64 start -p mount-start-2-638588 --memory=3072 --mount-string /tmp/TestMountStartserial1348041936/001:/minikube-host --mount-gid 0 --mount-msize 6543 --mount-port 46465 --mount-uid 0 --no-kubernetes --driver=docker  --container-runtime=containerd
mount_start_test.go:118: (dbg) Done: out/minikube-linux-arm64 start -p mount-start-2-638588 --memory=3072 --mount-string /tmp/TestMountStartserial1348041936/001:/minikube-host --mount-gid 0 --mount-msize 6543 --mount-port 46465 --mount-uid 0 --no-kubernetes --driver=docker  --container-runtime=containerd: (7.540313089s)
--- PASS: TestMountStart/serial/StartWithMountSecond (8.54s)

=== RUN   TestMountStart/serial/VerifyMountSecond
mount_start_test.go:134: (dbg) Run:  out/minikube-linux-arm64 -p mount-start-2-638588 ssh -- ls /minikube-host
--- PASS: TestMountStart/serial/VerifyMountSecond (0.27s)

=== RUN   TestMountStart/serial/DeleteFirst
pause_test.go:132: (dbg) Run:  out/minikube-linux-arm64 delete -p mount-start-1-636693 --alsologtostderr -v=5
pause_test.go:132: (dbg) Done: out/minikube-linux-arm64 delete -p mount-start-1-636693 --alsologtostderr -v=5: (1.722900539s)
--- PASS: TestMountStart/serial/DeleteFirst (1.72s)

=== RUN   TestMountStart/serial/VerifyMountPostDelete
mount_start_test.go:134: (dbg) Run:  out/minikube-linux-arm64 -p mount-start-2-638588 ssh -- ls /minikube-host
--- PASS: TestMountStart/serial/VerifyMountPostDelete (0.28s)

=== RUN   TestMountStart/serial/Stop
mount_start_test.go:196: (dbg) Run:  out/minikube-linux-arm64 stop -p mount-start-2-638588
mount_start_test.go:196: (dbg) Done: out/minikube-linux-arm64 stop -p mount-start-2-638588: (1.28214445s)
--- PASS: TestMountStart/serial/Stop (1.28s)

=== RUN   TestMountStart/serial/RestartStopped
mount_start_test.go:207: (dbg) Run:  out/minikube-linux-arm64 start -p mount-start-2-638588
mount_start_test.go:207: (dbg) Done: out/minikube-linux-arm64 start -p mount-start-2-638588: (6.938602936s)
--- PASS: TestMountStart/serial/RestartStopped (7.94s)

=== RUN   TestMountStart/serial/VerifyMountPostStop
mount_start_test.go:134: (dbg) Run:  out/minikube-linux-arm64 -p mount-start-2-638588 ssh -- ls /minikube-host
--- PASS: TestMountStart/serial/VerifyMountPostStop (0.29s)

=== RUN   TestMultiNode/serial/FreshStart2Nodes
multinode_test.go:96: (dbg) Run:  out/minikube-linux-arm64 start -p multinode-265336 --wait=true --memory=3072 --nodes=2 -v=5 --alsologtostderr --driver=docker  --container-runtime=containerd
E1202 19:47:26.937371    4435 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1202 19:48:00.703797    4435 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-224594/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
multinode_test.go:96: (dbg) Done: out/minikube-linux-arm64 start -p multinode-265336 --wait=true --memory=3072 --nodes=2 -v=5 --alsologtostderr --driver=docker  --container-runtime=containerd: (1m45.515729985s)
multinode_test.go:102: (dbg) Run:  out/minikube-linux-arm64 -p multinode-265336 status --alsologtostderr
--- PASS: TestMultiNode/serial/FreshStart2Nodes (106.09s)

=== RUN   TestMultiNode/serial/DeployApp2Nodes
multinode_test.go:493: (dbg) Run:  out/minikube-linux-arm64 kubectl -p multinode-265336 -- apply -f ./testdata/multinodes/multinode-pod-dns-test.yaml
multinode_test.go:498: (dbg) Run:  out/minikube-linux-arm64 kubectl -p multinode-265336 -- rollout status deployment/busybox
multinode_test.go:498: (dbg) Done: out/minikube-linux-arm64 kubectl -p multinode-265336 -- rollout status deployment/busybox: (3.064562297s)
multinode_test.go:505: (dbg) Run:  out/minikube-linux-arm64 kubectl -p multinode-265336 -- get pods -o jsonpath='{.items[*].status.podIP}'
multinode_test.go:528: (dbg) Run:  out/minikube-linux-arm64 kubectl -p multinode-265336 -- get pods -o jsonpath='{.items[*].metadata.name}'
multinode_test.go:536: (dbg) Run:  out/minikube-linux-arm64 kubectl -p multinode-265336 -- exec busybox-7b57f96db7-fx9c2 -- nslookup kubernetes.io
multinode_test.go:536: (dbg) Run:  out/minikube-linux-arm64 kubectl -p multinode-265336 -- exec busybox-7b57f96db7-wd67f -- nslookup kubernetes.io
multinode_test.go:546: (dbg) Run:  out/minikube-linux-arm64 kubectl -p multinode-265336 -- exec busybox-7b57f96db7-fx9c2 -- nslookup kubernetes.default
multinode_test.go:546: (dbg) Run:  out/minikube-linux-arm64 kubectl -p multinode-265336 -- exec busybox-7b57f96db7-wd67f -- nslookup kubernetes.default
multinode_test.go:554: (dbg) Run:  out/minikube-linux-arm64 kubectl -p multinode-265336 -- exec busybox-7b57f96db7-fx9c2 -- nslookup kubernetes.default.svc.cluster.local
multinode_test.go:554: (dbg) Run:  out/minikube-linux-arm64 kubectl -p multinode-265336 -- exec busybox-7b57f96db7-wd67f -- nslookup kubernetes.default.svc.cluster.local
--- PASS: TestMultiNode/serial/DeployApp2Nodes (5.01s)

=== RUN   TestMultiNode/serial/PingHostFrom2Pods
multinode_test.go:564: (dbg) Run:  out/minikube-linux-arm64 kubectl -p multinode-265336 -- get pods -o jsonpath='{.items[*].metadata.name}'
multinode_test.go:572: (dbg) Run:  out/minikube-linux-arm64 kubectl -p multinode-265336 -- exec busybox-7b57f96db7-fx9c2 -- sh -c "nslookup host.minikube.internal | awk 'NR==5' | cut -d' ' -f3"
multinode_test.go:583: (dbg) Run:  out/minikube-linux-arm64 kubectl -p multinode-265336 -- exec busybox-7b57f96db7-fx9c2 -- sh -c "ping -c 1 192.168.67.1"
multinode_test.go:572: (dbg) Run:  out/minikube-linux-arm64 kubectl -p multinode-265336 -- exec busybox-7b57f96db7-wd67f -- sh -c "nslookup host.minikube.internal | awk 'NR==5' | cut -d' ' -f3"
multinode_test.go:583: (dbg) Run:  out/minikube-linux-arm64 kubectl -p multinode-265336 -- exec busybox-7b57f96db7-wd67f -- sh -c "ping -c 1 192.168.67.1"
--- PASS: TestMultiNode/serial/PingHostFrom2Pods (1.07s)

=== RUN   TestMultiNode/serial/AddNode
multinode_test.go:121: (dbg) Run:  out/minikube-linux-arm64 node add -p multinode-265336 -v=5 --alsologtostderr
multinode_test.go:121: (dbg) Done: out/minikube-linux-arm64 node add -p multinode-265336 -v=5 --alsologtostderr: (56.789463859s)
multinode_test.go:127: (dbg) Run:  out/minikube-linux-arm64 -p multinode-265336 status --alsologtostderr
--- PASS: TestMultiNode/serial/AddNode (57.52s)

=== RUN   TestMultiNode/serial/MultiNodeLabels
multinode_test.go:221: (dbg) Run:  kubectl --context multinode-265336 get nodes -o "jsonpath=[{range .items[*]}{.metadata.labels},{end}]"
--- PASS: TestMultiNode/serial/MultiNodeLabels (0.10s)

=== RUN   TestMultiNode/serial/ProfileList
multinode_test.go:143: (dbg) Run:  out/minikube-linux-arm64 profile list --output json
--- PASS: TestMultiNode/serial/ProfileList (0.75s)

=== RUN   TestMultiNode/serial/CopyFile
multinode_test.go:184: (dbg) Run:  out/minikube-linux-arm64 -p multinode-265336 status --output json --alsologtostderr
helpers_test.go:573: (dbg) Run:  out/minikube-linux-arm64 -p multinode-265336 cp testdata/cp-test.txt multinode-265336:/home/docker/cp-test.txt
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p multinode-265336 ssh -n multinode-265336 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:573: (dbg) Run:  out/minikube-linux-arm64 -p multinode-265336 cp multinode-265336:/home/docker/cp-test.txt /tmp/TestMultiNodeserialCopyFile501261924/001/cp-test_multinode-265336.txt
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p multinode-265336 ssh -n multinode-265336 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:573: (dbg) Run:  out/minikube-linux-arm64 -p multinode-265336 cp multinode-265336:/home/docker/cp-test.txt multinode-265336-m02:/home/docker/cp-test_multinode-265336_multinode-265336-m02.txt
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p multinode-265336 ssh -n multinode-265336 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p multinode-265336 ssh -n multinode-265336-m02 "sudo cat /home/docker/cp-test_multinode-265336_multinode-265336-m02.txt"
helpers_test.go:573: (dbg) Run:  out/minikube-linux-arm64 -p multinode-265336 cp multinode-265336:/home/docker/cp-test.txt multinode-265336-m03:/home/docker/cp-test_multinode-265336_multinode-265336-m03.txt
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p multinode-265336 ssh -n multinode-265336 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p multinode-265336 ssh -n multinode-265336-m03 "sudo cat /home/docker/cp-test_multinode-265336_multinode-265336-m03.txt"
helpers_test.go:573: (dbg) Run:  out/minikube-linux-arm64 -p multinode-265336 cp testdata/cp-test.txt multinode-265336-m02:/home/docker/cp-test.txt
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p multinode-265336 ssh -n multinode-265336-m02 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:573: (dbg) Run:  out/minikube-linux-arm64 -p multinode-265336 cp multinode-265336-m02:/home/docker/cp-test.txt /tmp/TestMultiNodeserialCopyFile501261924/001/cp-test_multinode-265336-m02.txt
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p multinode-265336 ssh -n multinode-265336-m02 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:573: (dbg) Run:  out/minikube-linux-arm64 -p multinode-265336 cp multinode-265336-m02:/home/docker/cp-test.txt multinode-265336:/home/docker/cp-test_multinode-265336-m02_multinode-265336.txt
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p multinode-265336 ssh -n multinode-265336-m02 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p multinode-265336 ssh -n multinode-265336 "sudo cat /home/docker/cp-test_multinode-265336-m02_multinode-265336.txt"
helpers_test.go:573: (dbg) Run:  out/minikube-linux-arm64 -p multinode-265336 cp multinode-265336-m02:/home/docker/cp-test.txt multinode-265336-m03:/home/docker/cp-test_multinode-265336-m02_multinode-265336-m03.txt
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p multinode-265336 ssh -n multinode-265336-m02 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p multinode-265336 ssh -n multinode-265336-m03 "sudo cat /home/docker/cp-test_multinode-265336-m02_multinode-265336-m03.txt"
helpers_test.go:573: (dbg) Run:  out/minikube-linux-arm64 -p multinode-265336 cp testdata/cp-test.txt multinode-265336-m03:/home/docker/cp-test.txt
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p multinode-265336 ssh -n multinode-265336-m03 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:573: (dbg) Run:  out/minikube-linux-arm64 -p multinode-265336 cp multinode-265336-m03:/home/docker/cp-test.txt /tmp/TestMultiNodeserialCopyFile501261924/001/cp-test_multinode-265336-m03.txt
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p multinode-265336 ssh -n multinode-265336-m03 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:573: (dbg) Run:  out/minikube-linux-arm64 -p multinode-265336 cp multinode-265336-m03:/home/docker/cp-test.txt multinode-265336:/home/docker/cp-test_multinode-265336-m03_multinode-265336.txt
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p multinode-265336 ssh -n multinode-265336-m03 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p multinode-265336 ssh -n multinode-265336 "sudo cat /home/docker/cp-test_multinode-265336-m03_multinode-265336.txt"
helpers_test.go:573: (dbg) Run:  out/minikube-linux-arm64 -p multinode-265336 cp multinode-265336-m03:/home/docker/cp-test.txt multinode-265336-m02:/home/docker/cp-test_multinode-265336-m03_multinode-265336-m02.txt
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p multinode-265336 ssh -n multinode-265336-m03 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:551: (dbg) Run:  out/minikube-linux-arm64 -p multinode-265336 ssh -n multinode-265336-m02 "sudo cat /home/docker/cp-test_multinode-265336-m03_multinode-265336-m02.txt"
--- PASS: TestMultiNode/serial/CopyFile (10.71s)

=== RUN   TestMultiNode/serial/StopNode
multinode_test.go:248: (dbg) Run:  out/minikube-linux-arm64 -p multinode-265336 node stop m03
multinode_test.go:248: (dbg) Done: out/minikube-linux-arm64 -p multinode-265336 node stop m03: (1.329061705s)
multinode_test.go:254: (dbg) Run:  out/minikube-linux-arm64 -p multinode-265336 status
multinode_test.go:254: (dbg) Non-zero exit: out/minikube-linux-arm64 -p multinode-265336 status: exit status 7 (566.415828ms)
-- stdout --
	multinode-265336
	type: Control Plane
	host: Running
	kubelet: Running
	apiserver: Running
	kubeconfig: Configured
	
	multinode-265336-m02
	type: Worker
	host: Running
	kubelet: Running
	
	multinode-265336-m03
	type: Worker
	host: Stopped
	kubelet: Stopped
	
-- /stdout --
multinode_test.go:261: (dbg) Run:  out/minikube-linux-arm64 -p multinode-265336 status --alsologtostderr
multinode_test.go:261: (dbg) Non-zero exit: out/minikube-linux-arm64 -p multinode-265336 status --alsologtostderr: exit status 7 (575.559266ms)
-- stdout --
	multinode-265336
	type: Control Plane
	host: Running
	kubelet: Running
	apiserver: Running
	kubeconfig: Configured
	
	multinode-265336-m02
	type: Worker
	host: Running
	kubelet: Running
	
	multinode-265336-m03
	type: Worker
	host: Stopped
	kubelet: Stopped
	
-- /stdout --
** stderr **
	I1202 19:49:57.713692  160183 out.go:360] Setting OutFile to fd 1 ...
	I1202 19:49:57.713886  160183 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1202 19:49:57.713913  160183 out.go:374] Setting ErrFile to fd 2...
	I1202 19:49:57.713933  160183 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1202 19:49:57.714211  160183 root.go:338] Updating PATH: /home/jenkins/minikube-integration/22021-2487/.minikube/bin
	I1202 19:49:57.714424  160183 out.go:368] Setting JSON to false
	I1202 19:49:57.714472  160183 mustload.go:66] Loading cluster: multinode-265336
	I1202 19:49:57.714602  160183 notify.go:221] Checking for updates...
	I1202 19:49:57.715019  160183 config.go:182] Loaded profile config "multinode-265336": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.34.2
	I1202 19:49:57.715054  160183 status.go:174] checking status of multinode-265336 ...
	I1202 19:49:57.715778  160183 cli_runner.go:164] Run: docker container inspect multinode-265336 --format={{.State.Status}}
	I1202 19:49:57.740219  160183 status.go:371] multinode-265336 host status = "Running" (err=<nil>)
	I1202 19:49:57.740240  160183 host.go:66] Checking if "multinode-265336" exists ...
	I1202 19:49:57.740775  160183 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" multinode-265336
	I1202 19:49:57.760200  160183 host.go:66] Checking if "multinode-265336" exists ...
	I1202 19:49:57.760712  160183 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I1202 19:49:57.760776  160183 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" multinode-265336
	I1202 19:49:57.789613  160183 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:32915 SSHKeyPath:/home/jenkins/minikube-integration/22021-2487/.minikube/machines/multinode-265336/id_rsa Username:docker}
	I1202 19:49:57.897828  160183 ssh_runner.go:195] Run: systemctl --version
	I1202 19:49:57.904406  160183 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I1202 19:49:57.917915  160183 cli_runner.go:164] Run: docker system info --format "{{json .}}"
	I1202 19:49:57.993375  160183 info.go:266] docker info: {ID:EOU5:DNGX:XN6V:L2FZ:UXRM:5TWK:EVUR:KC2F:GT7Z:Y4O4:GB77:5PD3 Containers:3 ContainersRunning:2 ContainersPaused:0 ContainersStopped:1 Images:3 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Using metacopy false] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:false BridgeNfIP6Tables:false Debug:false NFd:49 OomKillDisable:true NGoroutines:62 SystemTime:2025-12-02 19:49:57.982001463 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1084-aws OperatingSystem:Ubuntu 20.04.6 LTS OSType:linux Architecture:aarch64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[::1/128 127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:2 MemTotal:8214839296 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ip-172-31-31-251 Labels:[] ExperimentalBuild:false ServerVersion:28.1.1 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:05044ec0a9a75232cad458027ca83437aae3f4da Expected:} RuncCommit:{ID:v1.2.5-0-g59923ef Expected:} InitCommit:{ID:de40ad0 Expected:} SecurityOptions:[name=apparmor name=seccomp,profile=builtin] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.23.0] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.35.1]] Warnings:<nil>}}
	I1202 19:49:57.993933  160183 kubeconfig.go:125] found "multinode-265336" server: "https://192.168.67.2:8443"
	I1202 19:49:57.993972  160183 api_server.go:166] Checking apiserver status ...
	I1202 19:49:57.994020  160183 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I1202 19:49:58.010823  160183 ssh_runner.go:195] Run: sudo egrep ^[0-9]+:freezer: /proc/1384/cgroup
	I1202 19:49:58.020411  160183 api_server.go:182] apiserver freezer: "11:freezer:/docker/a8a0517913412d54b9c9dd050af6f58bce7892c9637937cddbd1cdd0fb22fb7a/kubepods/burstable/pode44f421b3dc65077d367a551944bb782/c7e9b71b277227f0fb13a50494a3bd907eff2e346fa018f6b501abe265e00207"
	I1202 19:49:58.020505  160183 ssh_runner.go:195] Run: sudo cat /sys/fs/cgroup/freezer/docker/a8a0517913412d54b9c9dd050af6f58bce7892c9637937cddbd1cdd0fb22fb7a/kubepods/burstable/pode44f421b3dc65077d367a551944bb782/c7e9b71b277227f0fb13a50494a3bd907eff2e346fa018f6b501abe265e00207/freezer.state
	I1202 19:49:58.028887  160183 api_server.go:204] freezer state: "THAWED"
	I1202 19:49:58.028920  160183 api_server.go:253] Checking apiserver healthz at https://192.168.67.2:8443/healthz ...
	I1202 19:49:58.038648  160183 api_server.go:279] https://192.168.67.2:8443/healthz returned 200:
	ok
	I1202 19:49:58.038683  160183 status.go:463] multinode-265336 apiserver status = Running (err=<nil>)
	I1202 19:49:58.038695  160183 status.go:176] multinode-265336 status: &{Name:multinode-265336 Host:Running Kubelet:Running APIServer:Running Kubeconfig:Configured Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I1202 19:49:58.038712  160183 status.go:174] checking status of multinode-265336-m02 ...
	I1202 19:49:58.039065  160183 cli_runner.go:164] Run: docker container inspect multinode-265336-m02 --format={{.State.Status}}
	I1202 19:49:58.057527  160183 status.go:371] multinode-265336-m02 host status = "Running" (err=<nil>)
	I1202 19:49:58.057554  160183 host.go:66] Checking if "multinode-265336-m02" exists ...
	I1202 19:49:58.057928  160183 cli_runner.go:164] Run: docker container inspect -f "{{range .NetworkSettings.Networks}}{{.IPAddress}},{{.GlobalIPv6Address}}{{end}}" multinode-265336-m02
	I1202 19:49:58.076242  160183 host.go:66] Checking if "multinode-265336-m02" exists ...
	I1202 19:49:58.076694  160183 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I1202 19:49:58.076743  160183 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" multinode-265336-m02
	I1202 19:49:58.094828  160183 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:32920 SSHKeyPath:/home/jenkins/minikube-integration/22021-2487/.minikube/machines/multinode-265336-m02/id_rsa Username:docker}
	I1202 19:49:58.202150  160183 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I1202 19:49:58.215714  160183 status.go:176] multinode-265336-m02 status: &{Name:multinode-265336-m02 Host:Running Kubelet:Running APIServer:Irrelevant Kubeconfig:Irrelevant Worker:true TimeToStop: DockerEnv: PodManEnv:}
	I1202 19:49:58.215748  160183 status.go:174] checking status of multinode-265336-m03 ...
	I1202 19:49:58.216093  160183 cli_runner.go:164] Run: docker container inspect multinode-265336-m03 --format={{.State.Status}}
	I1202 19:49:58.232955  160183 status.go:371] multinode-265336-m03 host status = "Stopped" (err=<nil>)
	I1202 19:49:58.232977  160183 status.go:384] host is not running, skipping remaining checks
	I1202 19:49:58.232985  160183 status.go:176] multinode-265336-m03 status: &{Name:multinode-265336-m03 Host:Stopped Kubelet:Stopped APIServer:Stopped Kubeconfig:Stopped Worker:true TimeToStop: DockerEnv: PodManEnv:}
** /stderr **
--- PASS: TestMultiNode/serial/StopNode (2.47s)

=== RUN   TestMultiNode/serial/StartAfterStop
multinode_test.go:282: (dbg) Run:  out/minikube-linux-arm64 -p multinode-265336 node start m03 -v=5 --alsologtostderr
multinode_test.go:282: (dbg) Done: out/minikube-linux-arm64 -p multinode-265336 node start m03 -v=5 --alsologtostderr: (7.24299774s)
multinode_test.go:290: (dbg) Run:  out/minikube-linux-arm64 -p multinode-265336 status -v=5 --alsologtostderr
multinode_test.go:306: (dbg) Run:  kubectl get nodes
--- PASS: TestMultiNode/serial/StartAfterStop (8.07s)

=== RUN   TestMultiNode/serial/RestartKeepsNodes
multinode_test.go:314: (dbg) Run:  out/minikube-linux-arm64 node list -p multinode-265336
multinode_test.go:321: (dbg) Run:  out/minikube-linux-arm64 stop -p multinode-265336
multinode_test.go:321: (dbg) Done: out/minikube-linux-arm64 stop -p multinode-265336: (25.20694017s)
multinode_test.go:326: (dbg) Run:  out/minikube-linux-arm64 start -p multinode-265336 --wait=true -v=5 --alsologtostderr
E1202 19:50:46.066079    4435 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/addons-932514/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1202 19:51:03.868782    4435 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
multinode_test.go:326: (dbg) Done: out/minikube-linux-arm64 start -p multinode-265336 --wait=true -v=5 --alsologtostderr: (50.248409265s)
multinode_test.go:331: (dbg) Run:  out/minikube-linux-arm64 node list -p multinode-265336
--- PASS: TestMultiNode/serial/RestartKeepsNodes (75.57s)

TestMultiNode/serial/DeleteNode (5.76s)

=== RUN   TestMultiNode/serial/DeleteNode
multinode_test.go:416: (dbg) Run:  out/minikube-linux-arm64 -p multinode-265336 node delete m03
multinode_test.go:416: (dbg) Done: out/minikube-linux-arm64 -p multinode-265336 node delete m03: (4.969284929s)
multinode_test.go:422: (dbg) Run:  out/minikube-linux-arm64 -p multinode-265336 status --alsologtostderr
multinode_test.go:436: (dbg) Run:  kubectl get nodes
multinode_test.go:444: (dbg) Run:  kubectl get nodes -o "go-template='{{range .items}}{{range .status.conditions}}{{if eq .type "Ready"}} {{.status}}{{"\n"}}{{end}}{{end}}{{end}}'"
--- PASS: TestMultiNode/serial/DeleteNode (5.76s)

TestMultiNode/serial/StopMultiNode (24.11s)

=== RUN   TestMultiNode/serial/StopMultiNode
multinode_test.go:345: (dbg) Run:  out/minikube-linux-arm64 -p multinode-265336 stop
multinode_test.go:345: (dbg) Done: out/minikube-linux-arm64 -p multinode-265336 stop: (23.941657323s)
multinode_test.go:351: (dbg) Run:  out/minikube-linux-arm64 -p multinode-265336 status
multinode_test.go:351: (dbg) Non-zero exit: out/minikube-linux-arm64 -p multinode-265336 status: exit status 7 (85.732581ms)

-- stdout --
	multinode-265336
	type: Control Plane
	host: Stopped
	kubelet: Stopped
	apiserver: Stopped
	kubeconfig: Stopped
	
	multinode-265336-m02
	type: Worker
	host: Stopped
	kubelet: Stopped
	

-- /stdout --
multinode_test.go:358: (dbg) Run:  out/minikube-linux-arm64 -p multinode-265336 status --alsologtostderr
multinode_test.go:358: (dbg) Non-zero exit: out/minikube-linux-arm64 -p multinode-265336 status --alsologtostderr: exit status 7 (86.824909ms)

-- stdout --
	multinode-265336
	type: Control Plane
	host: Stopped
	kubelet: Stopped
	apiserver: Stopped
	kubeconfig: Stopped
	
	multinode-265336-m02
	type: Worker
	host: Stopped
	kubelet: Stopped
	

-- /stdout --
** stderr ** 
	I1202 19:51:51.712671  168978 out.go:360] Setting OutFile to fd 1 ...
	I1202 19:51:51.712836  168978 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1202 19:51:51.712867  168978 out.go:374] Setting ErrFile to fd 2...
	I1202 19:51:51.712888  168978 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1202 19:51:51.713289  168978 root.go:338] Updating PATH: /home/jenkins/minikube-integration/22021-2487/.minikube/bin
	I1202 19:51:51.713541  168978 out.go:368] Setting JSON to false
	I1202 19:51:51.713596  168978 mustload.go:66] Loading cluster: multinode-265336
	I1202 19:51:51.714270  168978 config.go:182] Loaded profile config "multinode-265336": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.34.2
	I1202 19:51:51.714311  168978 status.go:174] checking status of multinode-265336 ...
	I1202 19:51:51.715072  168978 cli_runner.go:164] Run: docker container inspect multinode-265336 --format={{.State.Status}}
	I1202 19:51:51.715427  168978 notify.go:221] Checking for updates...
	I1202 19:51:51.734075  168978 status.go:371] multinode-265336 host status = "Stopped" (err=<nil>)
	I1202 19:51:51.734095  168978 status.go:384] host is not running, skipping remaining checks
	I1202 19:51:51.734101  168978 status.go:176] multinode-265336 status: &{Name:multinode-265336 Host:Stopped Kubelet:Stopped APIServer:Stopped Kubeconfig:Stopped Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I1202 19:51:51.734133  168978 status.go:174] checking status of multinode-265336-m02 ...
	I1202 19:51:51.734460  168978 cli_runner.go:164] Run: docker container inspect multinode-265336-m02 --format={{.State.Status}}
	I1202 19:51:51.751956  168978 status.go:371] multinode-265336-m02 host status = "Stopped" (err=<nil>)
	I1202 19:51:51.751978  168978 status.go:384] host is not running, skipping remaining checks
	I1202 19:51:51.751987  168978 status.go:176] multinode-265336-m02 status: &{Name:multinode-265336-m02 Host:Stopped Kubelet:Stopped APIServer:Stopped Kubeconfig:Stopped Worker:true TimeToStop: DockerEnv: PodManEnv:}

** /stderr **
--- PASS: TestMultiNode/serial/StopMultiNode (24.11s)

TestMultiNode/serial/RestartMultiNode (51.74s)

=== RUN   TestMultiNode/serial/RestartMultiNode
multinode_test.go:376: (dbg) Run:  out/minikube-linux-arm64 start -p multinode-265336 --wait=true -v=5 --alsologtostderr --driver=docker  --container-runtime=containerd
multinode_test.go:376: (dbg) Done: out/minikube-linux-arm64 start -p multinode-265336 --wait=true -v=5 --alsologtostderr --driver=docker  --container-runtime=containerd: (51.033428568s)
multinode_test.go:382: (dbg) Run:  out/minikube-linux-arm64 -p multinode-265336 status --alsologtostderr
multinode_test.go:396: (dbg) Run:  kubectl get nodes
multinode_test.go:404: (dbg) Run:  kubectl get nodes -o "go-template='{{range .items}}{{range .status.conditions}}{{if eq .type "Ready"}} {{.status}}{{"\n"}}{{end}}{{end}}{{end}}'"
--- PASS: TestMultiNode/serial/RestartMultiNode (51.74s)

TestMultiNode/serial/ValidateNameConflict (39.7s)

=== RUN   TestMultiNode/serial/ValidateNameConflict
multinode_test.go:455: (dbg) Run:  out/minikube-linux-arm64 node list -p multinode-265336
multinode_test.go:464: (dbg) Run:  out/minikube-linux-arm64 start -p multinode-265336-m02 --driver=docker  --container-runtime=containerd
multinode_test.go:464: (dbg) Non-zero exit: out/minikube-linux-arm64 start -p multinode-265336-m02 --driver=docker  --container-runtime=containerd: exit status 14 (101.246996ms)

-- stdout --
	* [multinode-265336-m02] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	  - MINIKUBE_LOCATION=22021
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - KUBECONFIG=/home/jenkins/minikube-integration/22021-2487/kubeconfig
	  - MINIKUBE_HOME=/home/jenkins/minikube-integration/22021-2487/.minikube
	  - MINIKUBE_BIN=out/minikube-linux-arm64
	  - MINIKUBE_FORCE_SYSTEMD=
	
	

-- /stdout --
** stderr ** 
	! Profile name 'multinode-265336-m02' is duplicated with machine name 'multinode-265336-m02' in profile 'multinode-265336'
	X Exiting due to MK_USAGE: Profile name should be unique

** /stderr **
multinode_test.go:472: (dbg) Run:  out/minikube-linux-arm64 start -p multinode-265336-m03 --driver=docker  --container-runtime=containerd
E1202 19:52:43.775867    4435 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-224594/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1202 19:53:00.705797    4435 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-224594/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
multinode_test.go:472: (dbg) Done: out/minikube-linux-arm64 start -p multinode-265336-m03 --driver=docker  --container-runtime=containerd: (37.110531718s)
multinode_test.go:479: (dbg) Run:  out/minikube-linux-arm64 node add -p multinode-265336
multinode_test.go:479: (dbg) Non-zero exit: out/minikube-linux-arm64 node add -p multinode-265336: exit status 80 (342.426459ms)

-- stdout --
	* Adding node m03 to cluster multinode-265336 as [worker]
	
	

-- /stdout --
** stderr ** 
	X Exiting due to GUEST_NODE_ADD: failed to add node: Node multinode-265336-m03 already exists in multinode-265336-m03 profile
	* 
	╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│    * Please also attach the following file to the GitHub issue:                             │
	│    * - /tmp/minikube_node_040ea7097fd6ed71e65be9a474587f81f0ccd21d_0.log                    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯

** /stderr **
multinode_test.go:484: (dbg) Run:  out/minikube-linux-arm64 delete -p multinode-265336-m03
multinode_test.go:484: (dbg) Done: out/minikube-linux-arm64 delete -p multinode-265336-m03: (2.093881014s)
--- PASS: TestMultiNode/serial/ValidateNameConflict (39.70s)

TestPreload (125.34s)

=== RUN   TestPreload
preload_test.go:41: (dbg) Run:  out/minikube-linux-arm64 start -p test-preload-586415 --memory=3072 --alsologtostderr --wait=true --preload=false --driver=docker  --container-runtime=containerd
preload_test.go:41: (dbg) Done: out/minikube-linux-arm64 start -p test-preload-586415 --memory=3072 --alsologtostderr --wait=true --preload=false --driver=docker  --container-runtime=containerd: (1m0.460933595s)
preload_test.go:49: (dbg) Run:  out/minikube-linux-arm64 -p test-preload-586415 image pull gcr.io/k8s-minikube/busybox
preload_test.go:49: (dbg) Done: out/minikube-linux-arm64 -p test-preload-586415 image pull gcr.io/k8s-minikube/busybox: (2.338652572s)
preload_test.go:55: (dbg) Run:  out/minikube-linux-arm64 stop -p test-preload-586415
preload_test.go:55: (dbg) Done: out/minikube-linux-arm64 stop -p test-preload-586415: (5.95068551s)
preload_test.go:63: (dbg) Run:  out/minikube-linux-arm64 start -p test-preload-586415 --preload=true --alsologtostderr -v=1 --wait=true --driver=docker  --container-runtime=containerd
preload_test.go:63: (dbg) Done: out/minikube-linux-arm64 start -p test-preload-586415 --preload=true --alsologtostderr -v=1 --wait=true --driver=docker  --container-runtime=containerd: (53.946227122s)
preload_test.go:68: (dbg) Run:  out/minikube-linux-arm64 -p test-preload-586415 image list
helpers_test.go:175: Cleaning up "test-preload-586415" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-linux-arm64 delete -p test-preload-586415
helpers_test.go:178: (dbg) Done: out/minikube-linux-arm64 delete -p test-preload-586415: (2.39005359s)
--- PASS: TestPreload (125.34s)

TestScheduledStopUnix (108.11s)

=== RUN   TestScheduledStopUnix
scheduled_stop_test.go:128: (dbg) Run:  out/minikube-linux-arm64 start -p scheduled-stop-738511 --memory=3072 --driver=docker  --container-runtime=containerd
E1202 19:55:46.065865    4435 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/addons-932514/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1202 19:56:03.868498    4435 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
scheduled_stop_test.go:128: (dbg) Done: out/minikube-linux-arm64 start -p scheduled-stop-738511 --memory=3072 --driver=docker  --container-runtime=containerd: (31.708475956s)
scheduled_stop_test.go:137: (dbg) Run:  out/minikube-linux-arm64 stop -p scheduled-stop-738511 --schedule 5m -v=5 --alsologtostderr
minikube stop output:

** stderr ** 
	I1202 19:56:04.650455  184845 out.go:360] Setting OutFile to fd 1 ...
	I1202 19:56:04.650607  184845 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1202 19:56:04.650643  184845 out.go:374] Setting ErrFile to fd 2...
	I1202 19:56:04.650657  184845 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1202 19:56:04.650926  184845 root.go:338] Updating PATH: /home/jenkins/minikube-integration/22021-2487/.minikube/bin
	I1202 19:56:04.651188  184845 out.go:368] Setting JSON to false
	I1202 19:56:04.651302  184845 mustload.go:66] Loading cluster: scheduled-stop-738511
	I1202 19:56:04.651659  184845 config.go:182] Loaded profile config "scheduled-stop-738511": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.34.2
	I1202 19:56:04.651739  184845 profile.go:143] Saving config to /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/scheduled-stop-738511/config.json ...
	I1202 19:56:04.651918  184845 mustload.go:66] Loading cluster: scheduled-stop-738511
	I1202 19:56:04.652047  184845 config.go:182] Loaded profile config "scheduled-stop-738511": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.34.2

** /stderr **
scheduled_stop_test.go:204: (dbg) Run:  out/minikube-linux-arm64 status --format={{.TimeToStop}} -p scheduled-stop-738511 -n scheduled-stop-738511
scheduled_stop_test.go:172: signal error was:  <nil>
scheduled_stop_test.go:137: (dbg) Run:  out/minikube-linux-arm64 stop -p scheduled-stop-738511 --schedule 15s -v=5 --alsologtostderr
minikube stop output:

** stderr ** 
	I1202 19:56:05.142640  184935 out.go:360] Setting OutFile to fd 1 ...
	I1202 19:56:05.144517  184935 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1202 19:56:05.144537  184935 out.go:374] Setting ErrFile to fd 2...
	I1202 19:56:05.144546  184935 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1202 19:56:05.144993  184935 root.go:338] Updating PATH: /home/jenkins/minikube-integration/22021-2487/.minikube/bin
	I1202 19:56:05.145338  184935 out.go:368] Setting JSON to false
	I1202 19:56:05.146953  184935 daemonize_unix.go:73] killing process 184862 as it is an old scheduled stop
	I1202 19:56:05.147049  184935 mustload.go:66] Loading cluster: scheduled-stop-738511
	I1202 19:56:05.147498  184935 config.go:182] Loaded profile config "scheduled-stop-738511": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.34.2
	I1202 19:56:05.147583  184935 profile.go:143] Saving config to /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/scheduled-stop-738511/config.json ...
	I1202 19:56:05.147771  184935 mustload.go:66] Loading cluster: scheduled-stop-738511
	I1202 19:56:05.147895  184935 config.go:182] Loaded profile config "scheduled-stop-738511": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.34.2

** /stderr **
scheduled_stop_test.go:172: signal error was:  os: process already finished
I1202 19:56:05.155081    4435 retry.go:31] will retry after 148.515µs: open /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/scheduled-stop-738511/pid: no such file or directory
I1202 19:56:05.156204    4435 retry.go:31] will retry after 189.361µs: open /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/scheduled-stop-738511/pid: no such file or directory
I1202 19:56:05.157374    4435 retry.go:31] will retry after 131.558µs: open /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/scheduled-stop-738511/pid: no such file or directory
I1202 19:56:05.158497    4435 retry.go:31] will retry after 355.099µs: open /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/scheduled-stop-738511/pid: no such file or directory
I1202 19:56:05.159881    4435 retry.go:31] will retry after 322.605µs: open /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/scheduled-stop-738511/pid: no such file or directory
I1202 19:56:05.161040    4435 retry.go:31] will retry after 869.05µs: open /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/scheduled-stop-738511/pid: no such file or directory
I1202 19:56:05.162134    4435 retry.go:31] will retry after 1.530528ms: open /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/scheduled-stop-738511/pid: no such file or directory
I1202 19:56:05.164409    4435 retry.go:31] will retry after 1.818927ms: open /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/scheduled-stop-738511/pid: no such file or directory
I1202 19:56:05.167179    4435 retry.go:31] will retry after 3.667009ms: open /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/scheduled-stop-738511/pid: no such file or directory
I1202 19:56:05.171458    4435 retry.go:31] will retry after 4.788247ms: open /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/scheduled-stop-738511/pid: no such file or directory
I1202 19:56:05.176712    4435 retry.go:31] will retry after 8.252017ms: open /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/scheduled-stop-738511/pid: no such file or directory
I1202 19:56:05.185960    4435 retry.go:31] will retry after 7.478268ms: open /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/scheduled-stop-738511/pid: no such file or directory
I1202 19:56:05.194217    4435 retry.go:31] will retry after 10.172509ms: open /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/scheduled-stop-738511/pid: no such file or directory
I1202 19:56:05.205539    4435 retry.go:31] will retry after 13.570235ms: open /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/scheduled-stop-738511/pid: no such file or directory
I1202 19:56:05.219747    4435 retry.go:31] will retry after 19.259271ms: open /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/scheduled-stop-738511/pid: no such file or directory
I1202 19:56:05.239990    4435 retry.go:31] will retry after 31.990933ms: open /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/scheduled-stop-738511/pid: no such file or directory
scheduled_stop_test.go:137: (dbg) Run:  out/minikube-linux-arm64 stop -p scheduled-stop-738511 --cancel-scheduled
minikube stop output:

-- stdout --
	* All existing scheduled stops cancelled

-- /stdout --
scheduled_stop_test.go:189: (dbg) Run:  out/minikube-linux-arm64 status --format={{.Host}} -p scheduled-stop-738511 -n scheduled-stop-738511
scheduled_stop_test.go:218: (dbg) Run:  out/minikube-linux-arm64 status -p scheduled-stop-738511
scheduled_stop_test.go:137: (dbg) Run:  out/minikube-linux-arm64 stop -p scheduled-stop-738511 --schedule 15s -v=5 --alsologtostderr
minikube stop output:

** stderr ** 
	I1202 19:56:31.084800  185430 out.go:360] Setting OutFile to fd 1 ...
	I1202 19:56:31.084990  185430 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1202 19:56:31.085002  185430 out.go:374] Setting ErrFile to fd 2...
	I1202 19:56:31.085007  185430 out.go:408] TERM=,COLORTERM=, which probably does not support color
	I1202 19:56:31.085260  185430 root.go:338] Updating PATH: /home/jenkins/minikube-integration/22021-2487/.minikube/bin
	I1202 19:56:31.085536  185430 out.go:368] Setting JSON to false
	I1202 19:56:31.085642  185430 mustload.go:66] Loading cluster: scheduled-stop-738511
	I1202 19:56:31.086066  185430 config.go:182] Loaded profile config "scheduled-stop-738511": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.34.2
	I1202 19:56:31.086151  185430 profile.go:143] Saving config to /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/scheduled-stop-738511/config.json ...
	I1202 19:56:31.086367  185430 mustload.go:66] Loading cluster: scheduled-stop-738511
	I1202 19:56:31.086491  185430 config.go:182] Loaded profile config "scheduled-stop-738511": Driver=docker, ContainerRuntime=containerd, KubernetesVersion=v1.34.2

** /stderr **
scheduled_stop_test.go:172: signal error was:  os: process already finished
scheduled_stop_test.go:218: (dbg) Run:  out/minikube-linux-arm64 status -p scheduled-stop-738511
scheduled_stop_test.go:218: (dbg) Non-zero exit: out/minikube-linux-arm64 status -p scheduled-stop-738511: exit status 7 (71.086588ms)

-- stdout --
	scheduled-stop-738511
	type: Control Plane
	host: Stopped
	kubelet: Stopped
	apiserver: Stopped
	kubeconfig: Stopped
	

-- /stdout --
scheduled_stop_test.go:189: (dbg) Run:  out/minikube-linux-arm64 status --format={{.Host}} -p scheduled-stop-738511 -n scheduled-stop-738511
scheduled_stop_test.go:189: (dbg) Non-zero exit: out/minikube-linux-arm64 status --format={{.Host}} -p scheduled-stop-738511 -n scheduled-stop-738511: exit status 7 (74.103296ms)

-- stdout --
	Stopped

-- /stdout --
scheduled_stop_test.go:189: status error: exit status 7 (may be ok)
helpers_test.go:175: Cleaning up "scheduled-stop-738511" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-linux-arm64 delete -p scheduled-stop-738511
helpers_test.go:178: (dbg) Done: out/minikube-linux-arm64 delete -p scheduled-stop-738511: (4.755840885s)
--- PASS: TestScheduledStopUnix (108.11s)

TestInsufficientStorage (12.55s)

=== RUN   TestInsufficientStorage
status_test.go:50: (dbg) Run:  out/minikube-linux-arm64 start -p insufficient-storage-282691 --memory=3072 --output=json --wait=true --driver=docker  --container-runtime=containerd
status_test.go:50: (dbg) Non-zero exit: out/minikube-linux-arm64 start -p insufficient-storage-282691 --memory=3072 --output=json --wait=true --driver=docker  --container-runtime=containerd: exit status 26 (9.980028176s)

-- stdout --
	{"specversion":"1.0","id":"1d1b89a6-986f-446c-b4b5-cdae9de83217","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.step","datacontenttype":"application/json","data":{"currentstep":"0","message":"[insufficient-storage-282691] minikube v1.37.0 on Ubuntu 20.04 (arm64)","name":"Initial Minikube Setup","totalsteps":"19"}}
	{"specversion":"1.0","id":"088b5422-1351-40ac-8c0b-fc2ba6870192","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.info","datacontenttype":"application/json","data":{"message":"MINIKUBE_LOCATION=22021"}}
	{"specversion":"1.0","id":"9db48b4d-d7db-46a7-b7c3-b5a2a65fd3e2","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.info","datacontenttype":"application/json","data":{"message":"MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true"}}
	{"specversion":"1.0","id":"7db402ec-9d7a-4076-8c07-15bd2f9ff762","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.info","datacontenttype":"application/json","data":{"message":"KUBECONFIG=/home/jenkins/minikube-integration/22021-2487/kubeconfig"}}
	{"specversion":"1.0","id":"02069fcd-1a30-429d-bbd8-ca3afcd04492","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.info","datacontenttype":"application/json","data":{"message":"MINIKUBE_HOME=/home/jenkins/minikube-integration/22021-2487/.minikube"}}
	{"specversion":"1.0","id":"3f447328-3ed4-4471-a560-4124ba5fc085","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.info","datacontenttype":"application/json","data":{"message":"MINIKUBE_BIN=out/minikube-linux-arm64"}}
	{"specversion":"1.0","id":"0fcb591f-a6c2-44ee-9059-d1ae62ce7faa","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.info","datacontenttype":"application/json","data":{"message":"MINIKUBE_FORCE_SYSTEMD="}}
	{"specversion":"1.0","id":"3e55884a-e888-48c3-8020-766658921917","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.info","datacontenttype":"application/json","data":{"message":"MINIKUBE_TEST_STORAGE_CAPACITY=100"}}
	{"specversion":"1.0","id":"24af91e4-e0eb-4db9-b1e7-80c834c621f7","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.info","datacontenttype":"application/json","data":{"message":"MINIKUBE_TEST_AVAILABLE_STORAGE=19"}}
	{"specversion":"1.0","id":"97dcd882-4a80-4cc1-85e3-d4c35799d8be","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.step","datacontenttype":"application/json","data":{"currentstep":"1","message":"Using the docker driver based on user configuration","name":"Selecting Driver","totalsteps":"19"}}
	{"specversion":"1.0","id":"ef577629-b1e2-4d56-b7f7-96c31c3a63a7","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.info","datacontenttype":"application/json","data":{"message":"Using Docker driver with root privileges"}}
	{"specversion":"1.0","id":"501ebbd7-271c-423c-8647-9f9a2d8d7433","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.step","datacontenttype":"application/json","data":{"currentstep":"3","message":"Starting \"insufficient-storage-282691\" primary control-plane node in \"insufficient-storage-282691\" cluster","name":"Starting Node","totalsteps":"19"}}
	{"specversion":"1.0","id":"bf8f5b07-0b64-4b5d-a845-f9779c7196eb","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.step","datacontenttype":"application/json","data":{"currentstep":"5","message":"Pulling base image v0.0.48-1764169655-21974 ...","name":"Pulling Base Image","totalsteps":"19"}}
	{"specversion":"1.0","id":"41863544-75a1-462e-95cf-5abb788631aa","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.step","datacontenttype":"application/json","data":{"currentstep":"8","message":"Creating docker container (CPUs=2, Memory=3072MB) ...","name":"Creating Container","totalsteps":"19"}}
	{"specversion":"1.0","id":"de556c44-7c88-4e9d-85f5-82f4cc09c627","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.error","datacontenttype":"application/json","data":{"advice":"Try one or more of the following to free up space on the device:\n\n\t\t\t1. Run \"docker system prune\" to remove unused Docker data (optionally with \"-a\")\n\t\t\t2. Increase the storage allocated to Docker for Desktop by clicking on:\n\t\t\t\tDocker icon \u003e Preferences \u003e Resources \u003e Disk Image Size\n\t\t\t3. Run \"minikube ssh -- docker system prune\" if using the Docker container runtime","exitcode":"26","issues":"https://github.com/kubernetes/minikube/issues/9024","message":"Docker is out of disk space! (/var is at 100% of capacity). You can pass '--force' to skip this check.","name":"RSRC_DOCKER_STORAGE","url":""}}

-- /stdout --
status_test.go:76: (dbg) Run:  out/minikube-linux-arm64 status -p insufficient-storage-282691 --output=json --layout=cluster
status_test.go:76: (dbg) Non-zero exit: out/minikube-linux-arm64 status -p insufficient-storage-282691 --output=json --layout=cluster: exit status 7 (315.307931ms)

-- stdout --
	{"Name":"insufficient-storage-282691","StatusCode":507,"StatusName":"InsufficientStorage","StatusDetail":"/var is almost out of disk space","Step":"Creating Container","StepDetail":"Creating docker container (CPUs=2, Memory=3072MB) ...","BinaryVersion":"v1.37.0","Components":{"kubeconfig":{"Name":"kubeconfig","StatusCode":500,"StatusName":"Error"}},"Nodes":[{"Name":"insufficient-storage-282691","StatusCode":507,"StatusName":"InsufficientStorage","Components":{"apiserver":{"Name":"apiserver","StatusCode":405,"StatusName":"Stopped"},"kubelet":{"Name":"kubelet","StatusCode":405,"StatusName":"Stopped"}}}]}

-- /stdout --
** stderr ** 
	E1202 19:57:31.272245  187052 status.go:458] kubeconfig endpoint: get endpoint: "insufficient-storage-282691" does not appear in /home/jenkins/minikube-integration/22021-2487/kubeconfig

** /stderr **
status_test.go:76: (dbg) Run:  out/minikube-linux-arm64 status -p insufficient-storage-282691 --output=json --layout=cluster
status_test.go:76: (dbg) Non-zero exit: out/minikube-linux-arm64 status -p insufficient-storage-282691 --output=json --layout=cluster: exit status 7 (310.887514ms)

-- stdout --
	{"Name":"insufficient-storage-282691","StatusCode":507,"StatusName":"InsufficientStorage","StatusDetail":"/var is almost out of disk space","BinaryVersion":"v1.37.0","Components":{"kubeconfig":{"Name":"kubeconfig","StatusCode":500,"StatusName":"Error"}},"Nodes":[{"Name":"insufficient-storage-282691","StatusCode":507,"StatusName":"InsufficientStorage","Components":{"apiserver":{"Name":"apiserver","StatusCode":405,"StatusName":"Stopped"},"kubelet":{"Name":"kubelet","StatusCode":405,"StatusName":"Stopped"}}}]}

-- /stdout --
** stderr ** 
	E1202 19:57:31.584050  187117 status.go:458] kubeconfig endpoint: get endpoint: "insufficient-storage-282691" does not appear in /home/jenkins/minikube-integration/22021-2487/kubeconfig
	E1202 19:57:31.594122  187117 status.go:258] unable to read event log: stat: stat /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/insufficient-storage-282691/events.json: no such file or directory

** /stderr **
helpers_test.go:175: Cleaning up "insufficient-storage-282691" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-linux-arm64 delete -p insufficient-storage-282691
helpers_test.go:178: (dbg) Done: out/minikube-linux-arm64 delete -p insufficient-storage-282691: (1.942293697s)
--- PASS: TestInsufficientStorage (12.55s)
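The cluster-layout status JSON above is machine-readable. A minimal sketch of consuming it, using the payload captured in this log rather than a live `minikube status` call:

```python
import json

# Status payload as captured above from:
#   minikube status -p insufficient-storage-282691 --output=json --layout=cluster
payload = """{"Name":"insufficient-storage-282691","StatusCode":507,"StatusName":"InsufficientStorage","StatusDetail":"/var is almost out of disk space","BinaryVersion":"v1.37.0","Components":{"kubeconfig":{"Name":"kubeconfig","StatusCode":500,"StatusName":"Error"}},"Nodes":[{"Name":"insufficient-storage-282691","StatusCode":507,"StatusName":"InsufficientStorage","Components":{"apiserver":{"Name":"apiserver","StatusCode":405,"StatusName":"Stopped"},"kubelet":{"Name":"kubelet","StatusCode":405,"StatusName":"Stopped"}}}]}"""

status = json.loads(payload)
print(status["Name"], status["StatusName"])  # cluster-level state
for node in status["Nodes"]:
    for comp in node["Components"].values():
        # In this run's output, code 405 corresponds to "Stopped".
        print(f'  {node["Name"]}/{comp["Name"]}: {comp["StatusName"]}')
```

The exit status 7 above mirrors the degraded state encoded in `StatusCode`; the JSON carries the detail a caller would otherwise have to scrape from stderr.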

TestRunningBinaryUpgrade (323.11s)

=== RUN   TestRunningBinaryUpgrade
=== PAUSE TestRunningBinaryUpgrade

=== CONT  TestRunningBinaryUpgrade
version_upgrade_test.go:120: (dbg) Run:  /tmp/minikube-v1.35.0.3833423384 start -p running-upgrade-516082 --memory=3072 --vm-driver=docker  --container-runtime=containerd
E1202 20:06:03.868961    4435 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
version_upgrade_test.go:120: (dbg) Done: /tmp/minikube-v1.35.0.3833423384 start -p running-upgrade-516082 --memory=3072 --vm-driver=docker  --container-runtime=containerd: (33.501462011s)
version_upgrade_test.go:130: (dbg) Run:  out/minikube-linux-arm64 start -p running-upgrade-516082 --memory=3072 --alsologtostderr -v=1 --driver=docker  --container-runtime=containerd
E1202 20:08:00.703811    4435 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-224594/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1202 20:09:23.779428    4435 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-224594/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1202 20:10:46.065611    4435 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/addons-932514/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1202 20:11:03.868494    4435 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
version_upgrade_test.go:130: (dbg) Done: out/minikube-linux-arm64 start -p running-upgrade-516082 --memory=3072 --alsologtostderr -v=1 --driver=docker  --container-runtime=containerd: (4m36.3751917s)
helpers_test.go:175: Cleaning up "running-upgrade-516082" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-linux-arm64 delete -p running-upgrade-516082
helpers_test.go:178: (dbg) Done: out/minikube-linux-arm64 delete -p running-upgrade-516082: (1.993622545s)
--- PASS: TestRunningBinaryUpgrade (323.11s)

TestMissingContainerUpgrade (175.24s)

=== RUN   TestMissingContainerUpgrade
=== PAUSE TestMissingContainerUpgrade

=== CONT  TestMissingContainerUpgrade
version_upgrade_test.go:309: (dbg) Run:  /tmp/minikube-v1.35.0.471902243 start -p missing-upgrade-824445 --memory=3072 --driver=docker  --container-runtime=containerd
E1202 19:58:00.703394    4435 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-224594/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
version_upgrade_test.go:309: (dbg) Done: /tmp/minikube-v1.35.0.471902243 start -p missing-upgrade-824445 --memory=3072 --driver=docker  --container-runtime=containerd: (1m16.456394327s)
version_upgrade_test.go:318: (dbg) Run:  docker stop missing-upgrade-824445
version_upgrade_test.go:323: (dbg) Run:  docker rm missing-upgrade-824445
version_upgrade_test.go:329: (dbg) Run:  out/minikube-linux-arm64 start -p missing-upgrade-824445 --memory=3072 --alsologtostderr -v=1 --driver=docker  --container-runtime=containerd
version_upgrade_test.go:329: (dbg) Done: out/minikube-linux-arm64 start -p missing-upgrade-824445 --memory=3072 --alsologtostderr -v=1 --driver=docker  --container-runtime=containerd: (1m33.361063954s)
helpers_test.go:175: Cleaning up "missing-upgrade-824445" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-linux-arm64 delete -p missing-upgrade-824445
helpers_test.go:178: (dbg) Done: out/minikube-linux-arm64 delete -p missing-upgrade-824445: (2.105351668s)
--- PASS: TestMissingContainerUpgrade (175.24s)

TestNoKubernetes/serial/StartNoK8sWithVersion (0.09s)

=== RUN   TestNoKubernetes/serial/StartNoK8sWithVersion
no_kubernetes_test.go:108: (dbg) Run:  out/minikube-linux-arm64 start -p NoKubernetes-884696 --no-kubernetes --kubernetes-version=v1.28.0 --driver=docker  --container-runtime=containerd
no_kubernetes_test.go:108: (dbg) Non-zero exit: out/minikube-linux-arm64 start -p NoKubernetes-884696 --no-kubernetes --kubernetes-version=v1.28.0 --driver=docker  --container-runtime=containerd: exit status 14 (91.549721ms)

-- stdout --
	* [NoKubernetes-884696] minikube v1.37.0 on Ubuntu 20.04 (arm64)
	  - MINIKUBE_LOCATION=22021
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - KUBECONFIG=/home/jenkins/minikube-integration/22021-2487/kubeconfig
	  - MINIKUBE_HOME=/home/jenkins/minikube-integration/22021-2487/.minikube
	  - MINIKUBE_BIN=out/minikube-linux-arm64
	  - MINIKUBE_FORCE_SYSTEMD=
	
	

-- /stdout --
** stderr ** 
	X Exiting due to MK_USAGE: cannot specify --kubernetes-version with --no-kubernetes,
	to unset a global config run:
	
	$ minikube config unset kubernetes-version

** /stderr **
--- PASS: TestNoKubernetes/serial/StartNoK8sWithVersion (0.09s)

TestNoKubernetes/serial/StartWithK8s (42.48s)

=== RUN   TestNoKubernetes/serial/StartWithK8s
no_kubernetes_test.go:120: (dbg) Run:  out/minikube-linux-arm64 start -p NoKubernetes-884696 --memory=3072 --alsologtostderr -v=5 --driver=docker  --container-runtime=containerd
no_kubernetes_test.go:120: (dbg) Done: out/minikube-linux-arm64 start -p NoKubernetes-884696 --memory=3072 --alsologtostderr -v=5 --driver=docker  --container-runtime=containerd: (41.955041693s)
no_kubernetes_test.go:225: (dbg) Run:  out/minikube-linux-arm64 -p NoKubernetes-884696 status -o json
--- PASS: TestNoKubernetes/serial/StartWithK8s (42.48s)

TestNoKubernetes/serial/StartWithStopK8s (25.61s)

=== RUN   TestNoKubernetes/serial/StartWithStopK8s
no_kubernetes_test.go:137: (dbg) Run:  out/minikube-linux-arm64 start -p NoKubernetes-884696 --no-kubernetes --memory=3072 --alsologtostderr -v=5 --driver=docker  --container-runtime=containerd
no_kubernetes_test.go:137: (dbg) Done: out/minikube-linux-arm64 start -p NoKubernetes-884696 --no-kubernetes --memory=3072 --alsologtostderr -v=5 --driver=docker  --container-runtime=containerd: (22.878380892s)
no_kubernetes_test.go:225: (dbg) Run:  out/minikube-linux-arm64 -p NoKubernetes-884696 status -o json
no_kubernetes_test.go:225: (dbg) Non-zero exit: out/minikube-linux-arm64 -p NoKubernetes-884696 status -o json: exit status 2 (372.652797ms)

-- stdout --
	{"Name":"NoKubernetes-884696","Host":"Running","Kubelet":"Stopped","APIServer":"Stopped","Kubeconfig":"Configured","Worker":false}

-- /stdout --
no_kubernetes_test.go:149: (dbg) Run:  out/minikube-linux-arm64 delete -p NoKubernetes-884696
no_kubernetes_test.go:149: (dbg) Done: out/minikube-linux-arm64 delete -p NoKubernetes-884696: (2.355762722s)
--- PASS: TestNoKubernetes/serial/StartWithStopK8s (25.61s)
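The exit status 2 above is expected: `status -o json` reports the machine running with the Kubernetes components stopped. A small sketch of checking that combination against the payload captured in this log:

```python
import json

# Output captured above from: minikube -p NoKubernetes-884696 status -o json
raw = ('{"Name":"NoKubernetes-884696","Host":"Running","Kubelet":"Stopped",'
       '"APIServer":"Stopped","Kubeconfig":"Configured","Worker":false}')
st = json.loads(raw)

host_up = st["Host"] == "Running"
k8s_stopped = st["Kubelet"] == "Stopped" and st["APIServer"] == "Stopped"
# This is exactly the state a --no-kubernetes profile should land in.
print(host_up and k8s_stopped)  # → True
```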

TestNoKubernetes/serial/Start (9.4s)

=== RUN   TestNoKubernetes/serial/Start
no_kubernetes_test.go:161: (dbg) Run:  out/minikube-linux-arm64 start -p NoKubernetes-884696 --no-kubernetes --memory=3072 --alsologtostderr -v=5 --driver=docker  --container-runtime=containerd
no_kubernetes_test.go:161: (dbg) Done: out/minikube-linux-arm64 start -p NoKubernetes-884696 --no-kubernetes --memory=3072 --alsologtostderr -v=5 --driver=docker  --container-runtime=containerd: (9.401853823s)
--- PASS: TestNoKubernetes/serial/Start (9.40s)

TestNoKubernetes/serial/VerifyNok8sNoK8sDownloads (0s)

=== RUN   TestNoKubernetes/serial/VerifyNok8sNoK8sDownloads
no_kubernetes_test.go:89: Checking cache directory: /home/jenkins/minikube-integration/22021-2487/.minikube/cache/linux/arm64/v0.0.0
--- PASS: TestNoKubernetes/serial/VerifyNok8sNoK8sDownloads (0.00s)

TestNoKubernetes/serial/VerifyK8sNotRunning (0.43s)

=== RUN   TestNoKubernetes/serial/VerifyK8sNotRunning
no_kubernetes_test.go:172: (dbg) Run:  out/minikube-linux-arm64 ssh -p NoKubernetes-884696 "sudo systemctl is-active --quiet service kubelet"
no_kubernetes_test.go:172: (dbg) Non-zero exit: out/minikube-linux-arm64 ssh -p NoKubernetes-884696 "sudo systemctl is-active --quiet service kubelet": exit status 1 (430.333953ms)

** stderr ** 
	ssh: Process exited with status 3

** /stderr **
--- PASS: TestNoKubernetes/serial/VerifyK8sNotRunning (0.43s)

TestNoKubernetes/serial/ProfileList (3.18s)

=== RUN   TestNoKubernetes/serial/ProfileList
no_kubernetes_test.go:194: (dbg) Run:  out/minikube-linux-arm64 profile list
no_kubernetes_test.go:204: (dbg) Run:  out/minikube-linux-arm64 profile list --output=json
no_kubernetes_test.go:204: (dbg) Done: out/minikube-linux-arm64 profile list --output=json: (2.509175373s)
--- PASS: TestNoKubernetes/serial/ProfileList (3.18s)

TestNoKubernetes/serial/Stop (1.3s)

=== RUN   TestNoKubernetes/serial/Stop
no_kubernetes_test.go:183: (dbg) Run:  out/minikube-linux-arm64 stop -p NoKubernetes-884696
no_kubernetes_test.go:183: (dbg) Done: out/minikube-linux-arm64 stop -p NoKubernetes-884696: (1.29839609s)
--- PASS: TestNoKubernetes/serial/Stop (1.30s)

TestNoKubernetes/serial/StartNoArgs (6.91s)

=== RUN   TestNoKubernetes/serial/StartNoArgs
no_kubernetes_test.go:216: (dbg) Run:  out/minikube-linux-arm64 start -p NoKubernetes-884696 --driver=docker  --container-runtime=containerd
no_kubernetes_test.go:216: (dbg) Done: out/minikube-linux-arm64 start -p NoKubernetes-884696 --driver=docker  --container-runtime=containerd: (6.910831983s)
--- PASS: TestNoKubernetes/serial/StartNoArgs (6.91s)

TestNoKubernetes/serial/VerifyK8sNotRunningSecond (0.28s)

=== RUN   TestNoKubernetes/serial/VerifyK8sNotRunningSecond
no_kubernetes_test.go:172: (dbg) Run:  out/minikube-linux-arm64 ssh -p NoKubernetes-884696 "sudo systemctl is-active --quiet service kubelet"
no_kubernetes_test.go:172: (dbg) Non-zero exit: out/minikube-linux-arm64 ssh -p NoKubernetes-884696 "sudo systemctl is-active --quiet service kubelet": exit status 1 (284.257567ms)

** stderr ** 
	ssh: Process exited with status 3

** /stderr **
--- PASS: TestNoKubernetes/serial/VerifyK8sNotRunningSecond (0.28s)

TestStoppedBinaryUpgrade/Setup (11.21s)

=== RUN   TestStoppedBinaryUpgrade/Setup
--- PASS: TestStoppedBinaryUpgrade/Setup (11.21s)

TestStoppedBinaryUpgrade/Upgrade (305.96s)

=== RUN   TestStoppedBinaryUpgrade/Upgrade
version_upgrade_test.go:183: (dbg) Run:  /tmp/minikube-v1.35.0.1890032336 start -p stopped-upgrade-629737 --memory=3072 --vm-driver=docker  --container-runtime=containerd
E1202 20:00:46.065549    4435 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/addons-932514/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1202 20:01:03.868939    4435 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
version_upgrade_test.go:183: (dbg) Done: /tmp/minikube-v1.35.0.1890032336 start -p stopped-upgrade-629737 --memory=3072 --vm-driver=docker  --container-runtime=containerd: (36.300911516s)
version_upgrade_test.go:192: (dbg) Run:  /tmp/minikube-v1.35.0.1890032336 -p stopped-upgrade-629737 stop
version_upgrade_test.go:192: (dbg) Done: /tmp/minikube-v1.35.0.1890032336 -p stopped-upgrade-629737 stop: (1.25268535s)
version_upgrade_test.go:198: (dbg) Run:  out/minikube-linux-arm64 start -p stopped-upgrade-629737 --memory=3072 --alsologtostderr -v=1 --driver=docker  --container-runtime=containerd
E1202 20:02:09.138336    4435 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/addons-932514/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1202 20:03:00.703904    4435 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-224594/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
E1202 20:04:06.938860    4435 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/functional-449836/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
version_upgrade_test.go:198: (dbg) Done: out/minikube-linux-arm64 start -p stopped-upgrade-629737 --memory=3072 --alsologtostderr -v=1 --driver=docker  --container-runtime=containerd: (4m28.40505016s)
--- PASS: TestStoppedBinaryUpgrade/Upgrade (305.96s)

TestStoppedBinaryUpgrade/MinikubeLogs (2.16s)

=== RUN   TestStoppedBinaryUpgrade/MinikubeLogs
version_upgrade_test.go:206: (dbg) Run:  out/minikube-linux-arm64 logs -p stopped-upgrade-629737
E1202 20:05:46.065860    4435 cert_rotation.go:172] "Loading client cert failed" err="open /home/jenkins/minikube-integration/22021-2487/.minikube/profiles/addons-932514/client.crt: no such file or directory" logger="tls-transport-cache.UnhandledError" key="key"
version_upgrade_test.go:206: (dbg) Done: out/minikube-linux-arm64 logs -p stopped-upgrade-629737: (2.15781776s)
--- PASS: TestStoppedBinaryUpgrade/MinikubeLogs (2.16s)

TestPause/serial/Start (82.91s)

=== RUN   TestPause/serial/Start
pause_test.go:80: (dbg) Run:  out/minikube-linux-arm64 start -p pause-362069 --memory=3072 --install-addons=false --wait=all --driver=docker  --container-runtime=containerd
pause_test.go:80: (dbg) Done: out/minikube-linux-arm64 start -p pause-362069 --memory=3072 --install-addons=false --wait=all --driver=docker  --container-runtime=containerd: (1m22.913844947s)
--- PASS: TestPause/serial/Start (82.91s)

TestPause/serial/SecondStartNoReconfiguration (7.4s)

=== RUN   TestPause/serial/SecondStartNoReconfiguration
pause_test.go:92: (dbg) Run:  out/minikube-linux-arm64 start -p pause-362069 --alsologtostderr -v=1 --driver=docker  --container-runtime=containerd
pause_test.go:92: (dbg) Done: out/minikube-linux-arm64 start -p pause-362069 --alsologtostderr -v=1 --driver=docker  --container-runtime=containerd: (7.395300073s)
--- PASS: TestPause/serial/SecondStartNoReconfiguration (7.40s)

TestPause/serial/Pause (1.09s)

=== RUN   TestPause/serial/Pause
pause_test.go:110: (dbg) Run:  out/minikube-linux-arm64 pause -p pause-362069 --alsologtostderr -v=5
pause_test.go:110: (dbg) Done: out/minikube-linux-arm64 pause -p pause-362069 --alsologtostderr -v=5: (1.088206956s)
--- PASS: TestPause/serial/Pause (1.09s)

TestPause/serial/VerifyStatus (0.54s)

=== RUN   TestPause/serial/VerifyStatus
status_test.go:76: (dbg) Run:  out/minikube-linux-arm64 status -p pause-362069 --output=json --layout=cluster
status_test.go:76: (dbg) Non-zero exit: out/minikube-linux-arm64 status -p pause-362069 --output=json --layout=cluster: exit status 2 (541.823786ms)

-- stdout --
	{"Name":"pause-362069","StatusCode":418,"StatusName":"Paused","Step":"Done","StepDetail":"* Paused 7 containers in: kube-system, kubernetes-dashboard, istio-operator","BinaryVersion":"v1.37.0","Components":{"kubeconfig":{"Name":"kubeconfig","StatusCode":200,"StatusName":"OK"}},"Nodes":[{"Name":"pause-362069","StatusCode":200,"StatusName":"OK","Components":{"apiserver":{"Name":"apiserver","StatusCode":418,"StatusName":"Paused"},"kubelet":{"Name":"kubelet","StatusCode":405,"StatusName":"Stopped"}}}]}

-- /stdout --
--- PASS: TestPause/serial/VerifyStatus (0.54s)
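The `--layout=cluster` output borrows HTTP-style status codes (note the 418 "Paused" above). The code/name pairs observed in this run can be tabulated as follows; this is only what the captured JSON shows, not minikube's full code set:

```python
# Code → name pairs observed in this run's --layout=cluster output.
# (Not exhaustive; taken only from the JSON captured in this log.)
OBSERVED_STATUS = {
    200: "OK",
    405: "Stopped",
    418: "Paused",               # e.g. pause-362069 above
    500: "Error",                # kubeconfig component errors
    507: "InsufficientStorage",  # insufficient-storage-282691 above
}

def describe(code: int) -> str:
    """Map a status code seen in this run to its name, defaulting to Unknown."""
    return OBSERVED_STATUS.get(code, "Unknown")

print(describe(418))  # → Paused
```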

TestPause/serial/Unpause (1.01s)

=== RUN   TestPause/serial/Unpause
pause_test.go:121: (dbg) Run:  out/minikube-linux-arm64 unpause -p pause-362069 --alsologtostderr -v=5
pause_test.go:121: (dbg) Done: out/minikube-linux-arm64 unpause -p pause-362069 --alsologtostderr -v=5: (1.012467921s)
--- PASS: TestPause/serial/Unpause (1.01s)

TestPause/serial/PauseAgain (1.27s)

=== RUN   TestPause/serial/PauseAgain
pause_test.go:110: (dbg) Run:  out/minikube-linux-arm64 pause -p pause-362069 --alsologtostderr -v=5
pause_test.go:110: (dbg) Done: out/minikube-linux-arm64 pause -p pause-362069 --alsologtostderr -v=5: (1.2694365s)
--- PASS: TestPause/serial/PauseAgain (1.27s)

TestPause/serial/DeletePaused (3.81s)

=== RUN   TestPause/serial/DeletePaused
pause_test.go:132: (dbg) Run:  out/minikube-linux-arm64 delete -p pause-362069 --alsologtostderr -v=5
pause_test.go:132: (dbg) Done: out/minikube-linux-arm64 delete -p pause-362069 --alsologtostderr -v=5: (3.812075282s)
--- PASS: TestPause/serial/DeletePaused (3.81s)

TestPause/serial/VerifyDeletedResources (1.38s)

=== RUN   TestPause/serial/VerifyDeletedResources
pause_test.go:142: (dbg) Run:  out/minikube-linux-arm64 profile list --output json
pause_test.go:142: (dbg) Done: out/minikube-linux-arm64 profile list --output json: (1.332603294s)
pause_test.go:168: (dbg) Run:  docker ps -a
pause_test.go:173: (dbg) Run:  docker volume inspect pause-362069
pause_test.go:173: (dbg) Non-zero exit: docker volume inspect pause-362069: exit status 1 (14.780027ms)

-- stdout --
	[]

-- /stdout --
** stderr ** 
	Error response from daemon: get pause-362069: no such volume

** /stderr **
pause_test.go:178: (dbg) Run:  docker network ls
--- PASS: TestPause/serial/VerifyDeletedResources (1.38s)


Test skip (34/321)

Order skipped test Duration
5 TestDownloadOnly/v1.28.0/cached-images 0
6 TestDownloadOnly/v1.28.0/binaries 0
7 TestDownloadOnly/v1.28.0/kubectl 0
14 TestDownloadOnly/v1.34.2/cached-images 0
15 TestDownloadOnly/v1.34.2/binaries 0
16 TestDownloadOnly/v1.34.2/kubectl 0
22 TestDownloadOnly/v1.35.0-beta.0/preload-exists 0.18
25 TestDownloadOnly/v1.35.0-beta.0/kubectl 0
29 TestDownloadOnlyKic 0.44
31 TestOffline 0
42 TestAddons/serial/GCPAuth/RealCredentials 0.01
49 TestAddons/parallel/Olm 0
56 TestAddons/parallel/AmdGpuDevicePlugin 0
60 TestDockerFlags 0
64 TestHyperKitDriverInstallOrUpdate 0
65 TestHyperkitDriverSkipUpgrade 0
112 TestFunctional/parallel/MySQL 0
116 TestFunctional/parallel/DockerEnv 0
117 TestFunctional/parallel/PodmanEnv 0
130 TestFunctional/parallel/TunnelCmd/serial/DNSResolutionByDig 0
131 TestFunctional/parallel/TunnelCmd/serial/DNSResolutionByDscacheutil 0
132 TestFunctional/parallel/TunnelCmd/serial/AccessThroughDNS 0
207 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/MySQL 0
211 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/DockerEnv 0
212 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PodmanEnv 0
224 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/TunnelCmd/serial/DNSResolutionByDig 0
225 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/TunnelCmd/serial/DNSResolutionByDscacheutil 0
226 TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/TunnelCmd/serial/AccessThroughDNS 0
261 TestGvisorAddon 0
283 TestImageBuild 0
284 TestISOImage 0
348 TestChangeNoneUser 0
351 TestScheduledStopWindows 0
353 TestSkaffold 0
TestDownloadOnly/v1.28.0/cached-images (0s)

=== RUN   TestDownloadOnly/v1.28.0/cached-images
aaa_download_only_test.go:128: Preload exists, images won't be cached
--- SKIP: TestDownloadOnly/v1.28.0/cached-images (0.00s)

TestDownloadOnly/v1.28.0/binaries (0s)

=== RUN   TestDownloadOnly/v1.28.0/binaries
aaa_download_only_test.go:150: Preload exists, binaries are present within.
--- SKIP: TestDownloadOnly/v1.28.0/binaries (0.00s)

TestDownloadOnly/v1.28.0/kubectl (0s)

=== RUN   TestDownloadOnly/v1.28.0/kubectl
aaa_download_only_test.go:166: Test for darwin and windows
--- SKIP: TestDownloadOnly/v1.28.0/kubectl (0.00s)

TestDownloadOnly/v1.34.2/cached-images (0s)

=== RUN   TestDownloadOnly/v1.34.2/cached-images
aaa_download_only_test.go:128: Preload exists, images won't be cached
--- SKIP: TestDownloadOnly/v1.34.2/cached-images (0.00s)

TestDownloadOnly/v1.34.2/binaries (0s)

=== RUN   TestDownloadOnly/v1.34.2/binaries
aaa_download_only_test.go:150: Preload exists, binaries are present within.
--- SKIP: TestDownloadOnly/v1.34.2/binaries (0.00s)

TestDownloadOnly/v1.34.2/kubectl (0s)

=== RUN   TestDownloadOnly/v1.34.2/kubectl
aaa_download_only_test.go:166: Test for darwin and windows
--- SKIP: TestDownloadOnly/v1.34.2/kubectl (0.00s)

TestDownloadOnly/v1.35.0-beta.0/preload-exists (0.18s)

=== RUN   TestDownloadOnly/v1.35.0-beta.0/preload-exists
I1202 18:48:08.952999    4435 preload.go:188] Checking if preload exists for k8s version v1.35.0-beta.0 and runtime containerd
W1202 18:48:09.083252    4435 preload.go:144] https://storage.googleapis.com/minikube-preloaded-volume-tarballs/v18/v1.35.0-beta.0/preloaded-images-k8s-v18-v1.35.0-beta.0-containerd-overlay2-arm64.tar.lz4 status code: 404
W1202 18:48:09.129958    4435 preload.go:144] https://github.com/kubernetes-sigs/minikube-preloads/releases/download/v18/preloaded-images-k8s-v18-v1.35.0-beta.0-containerd-overlay2-arm64.tar.lz4 status code: 404
aaa_download_only_test.go:113: No preload image
--- SKIP: TestDownloadOnly/v1.35.0-beta.0/preload-exists (0.18s)

TestDownloadOnly/v1.35.0-beta.0/kubectl (0s)

=== RUN   TestDownloadOnly/v1.35.0-beta.0/kubectl
aaa_download_only_test.go:166: Test for darwin and windows
--- SKIP: TestDownloadOnly/v1.35.0-beta.0/kubectl (0.00s)

TestDownloadOnlyKic (0.44s)

=== RUN   TestDownloadOnlyKic
aaa_download_only_test.go:231: (dbg) Run:  out/minikube-linux-arm64 start --download-only -p download-docker-469003 --alsologtostderr --driver=docker  --container-runtime=containerd
aaa_download_only_test.go:248: Skip for arm64 platform. See https://github.com/kubernetes/minikube/issues/10144
helpers_test.go:175: Cleaning up "download-docker-469003" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-linux-arm64 delete -p download-docker-469003
--- SKIP: TestDownloadOnlyKic (0.44s)

TestOffline (0s)

=== RUN   TestOffline
=== PAUSE TestOffline

=== CONT  TestOffline
aab_offline_test.go:35: skipping TestOffline - only docker runtime supported on arm64. See https://github.com/kubernetes/minikube/issues/10144
--- SKIP: TestOffline (0.00s)

TestAddons/serial/GCPAuth/RealCredentials (0.01s)

=== RUN   TestAddons/serial/GCPAuth/RealCredentials
addons_test.go:759: This test requires a GCE instance (excluding Cloud Shell) with a container based driver
--- SKIP: TestAddons/serial/GCPAuth/RealCredentials (0.01s)

TestAddons/parallel/Olm (0s)

=== RUN   TestAddons/parallel/Olm
=== PAUSE TestAddons/parallel/Olm

=== CONT  TestAddons/parallel/Olm
addons_test.go:483: Skipping OLM addon test until https://github.com/operator-framework/operator-lifecycle-manager/issues/2534 is resolved
--- SKIP: TestAddons/parallel/Olm (0.00s)

TestAddons/parallel/AmdGpuDevicePlugin (0s)

=== RUN   TestAddons/parallel/AmdGpuDevicePlugin
=== PAUSE TestAddons/parallel/AmdGpuDevicePlugin
=== CONT  TestAddons/parallel/AmdGpuDevicePlugin
addons_test.go:1033: skip amd gpu test on all but docker driver and amd64 platform
--- SKIP: TestAddons/parallel/AmdGpuDevicePlugin (0.00s)

TestDockerFlags (0s)

=== RUN   TestDockerFlags
docker_test.go:41: skipping: only runs with docker container runtime, currently testing containerd
--- SKIP: TestDockerFlags (0.00s)

TestHyperKitDriverInstallOrUpdate (0s)

=== RUN   TestHyperKitDriverInstallOrUpdate
driver_install_or_update_test.go:37: Skip if not darwin.
--- SKIP: TestHyperKitDriverInstallOrUpdate (0.00s)

TestHyperkitDriverSkipUpgrade (0s)

=== RUN   TestHyperkitDriverSkipUpgrade
driver_install_or_update_test.go:101: Skip if not darwin.
--- SKIP: TestHyperkitDriverSkipUpgrade (0.00s)

TestFunctional/parallel/MySQL (0s)

=== RUN   TestFunctional/parallel/MySQL
=== PAUSE TestFunctional/parallel/MySQL
=== CONT  TestFunctional/parallel/MySQL
functional_test.go:1792: arm64 is not supported by mysql. Skip the test. See https://github.com/kubernetes/minikube/issues/10144
--- SKIP: TestFunctional/parallel/MySQL (0.00s)

TestFunctional/parallel/DockerEnv (0s)

=== RUN   TestFunctional/parallel/DockerEnv
=== PAUSE TestFunctional/parallel/DockerEnv
=== CONT  TestFunctional/parallel/DockerEnv
functional_test.go:478: only validate docker env with docker container runtime, currently testing containerd
--- SKIP: TestFunctional/parallel/DockerEnv (0.00s)

TestFunctional/parallel/PodmanEnv (0s)

=== RUN   TestFunctional/parallel/PodmanEnv
=== PAUSE TestFunctional/parallel/PodmanEnv
=== CONT  TestFunctional/parallel/PodmanEnv
functional_test.go:565: only validate podman env with docker container runtime, currently testing containerd
--- SKIP: TestFunctional/parallel/PodmanEnv (0.00s)

TestFunctional/parallel/TunnelCmd/serial/DNSResolutionByDig (0s)

=== RUN   TestFunctional/parallel/TunnelCmd/serial/DNSResolutionByDig
functional_test_tunnel_test.go:99: DNS forwarding is only supported for Hyperkit on Darwin, skipping test DNS forwarding
--- SKIP: TestFunctional/parallel/TunnelCmd/serial/DNSResolutionByDig (0.00s)

TestFunctional/parallel/TunnelCmd/serial/DNSResolutionByDscacheutil (0s)

=== RUN   TestFunctional/parallel/TunnelCmd/serial/DNSResolutionByDscacheutil
functional_test_tunnel_test.go:99: DNS forwarding is only supported for Hyperkit on Darwin, skipping test DNS forwarding
--- SKIP: TestFunctional/parallel/TunnelCmd/serial/DNSResolutionByDscacheutil (0.00s)

TestFunctional/parallel/TunnelCmd/serial/AccessThroughDNS (0s)

=== RUN   TestFunctional/parallel/TunnelCmd/serial/AccessThroughDNS
functional_test_tunnel_test.go:99: DNS forwarding is only supported for Hyperkit on Darwin, skipping test DNS forwarding
--- SKIP: TestFunctional/parallel/TunnelCmd/serial/AccessThroughDNS (0.00s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/MySQL (0s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/MySQL
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/MySQL
=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/MySQL
functional_test.go:1792: arm64 is not supported by mysql. Skip the test. See https://github.com/kubernetes/minikube/issues/10144
--- SKIP: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/MySQL (0.00s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/DockerEnv (0s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/DockerEnv
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/DockerEnv
=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/DockerEnv
functional_test.go:478: only validate docker env with docker container runtime, currently testing containerd
--- SKIP: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/DockerEnv (0.00s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PodmanEnv (0s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PodmanEnv
=== PAUSE TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PodmanEnv
=== CONT  TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PodmanEnv
functional_test.go:565: only validate podman env with docker container runtime, currently testing containerd
--- SKIP: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/PodmanEnv (0.00s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/TunnelCmd/serial/DNSResolutionByDig (0s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/TunnelCmd/serial/DNSResolutionByDig
functional_test_tunnel_test.go:99: DNS forwarding is only supported for Hyperkit on Darwin, skipping test DNS forwarding
--- SKIP: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/TunnelCmd/serial/DNSResolutionByDig (0.00s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/TunnelCmd/serial/DNSResolutionByDscacheutil (0s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/TunnelCmd/serial/DNSResolutionByDscacheutil
functional_test_tunnel_test.go:99: DNS forwarding is only supported for Hyperkit on Darwin, skipping test DNS forwarding
--- SKIP: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/TunnelCmd/serial/DNSResolutionByDscacheutil (0.00s)

TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/TunnelCmd/serial/AccessThroughDNS (0s)

=== RUN   TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/TunnelCmd/serial/AccessThroughDNS
functional_test_tunnel_test.go:99: DNS forwarding is only supported for Hyperkit on Darwin, skipping test DNS forwarding
--- SKIP: TestFunctionalNewestKubernetes/Versionv1.35.0-beta.0/parallel/TunnelCmd/serial/AccessThroughDNS (0.00s)

TestGvisorAddon (0s)

=== RUN   TestGvisorAddon
gvisor_addon_test.go:34: skipping test because --gvisor=false
--- SKIP: TestGvisorAddon (0.00s)

TestImageBuild (0s)

=== RUN   TestImageBuild
image_test.go:33: 
--- SKIP: TestImageBuild (0.00s)

TestISOImage (0s)

=== RUN   TestISOImage
iso_test.go:36: This test requires a VM driver
--- SKIP: TestISOImage (0.00s)

TestChangeNoneUser (0s)

=== RUN   TestChangeNoneUser
none_test.go:38: Test requires none driver and SUDO_USER env to not be empty
--- SKIP: TestChangeNoneUser (0.00s)

TestScheduledStopWindows (0s)

=== RUN   TestScheduledStopWindows
scheduled_stop_test.go:42: test only runs on windows
--- SKIP: TestScheduledStopWindows (0.00s)

TestSkaffold (0s)

=== RUN   TestSkaffold
skaffold_test.go:45: skaffold requires docker-env, currently testing containerd container runtime
--- SKIP: TestSkaffold (0.00s)
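The `--- SKIP` lines above are produced by guard clauses at the top of each test that bail out when the container runtime, architecture, or OS under test is unsupported. A minimal sketch of that pattern, assuming a hypothetical `shouldSkip` helper (the real minikube tests call `t.Skipf` directly with the messages shown in this report):

```go
package main

import "fmt"

// shouldSkip mirrors the guard clauses seen in the log above: a test is
// skipped when the configured container runtime or CPU architecture is
// unsupported. The signature is a hypothetical stand-in, not minikube's
// actual helper.
func shouldSkip(containerRuntime, goarch string) (bool, string) {
	// docker_test.go / DockerEnv / Skaffold style guard: docker runtime only.
	if containerRuntime != "docker" {
		return true, fmt.Sprintf("only runs with docker container runtime, currently testing %s", containerRuntime)
	}
	// MySQL / TestOffline style guard: arm64 is unsupported,
	// see https://github.com/kubernetes/minikube/issues/10144.
	if goarch == "arm64" {
		return true, "not supported on arm64"
	}
	return false, ""
}

func main() {
	// This run's configuration: containerd runtime on arm64.
	if skip, reason := shouldSkip("containerd", "arm64"); skip {
		// In a real test this would be t.Skipf(reason), which `go test -v`
		// renders as the "--- SKIP: <name> (0.00s)" lines in this report.
		fmt.Println("SKIP:", reason)
	}
}
```

With both guards failing open only for docker/amd64, every runtime- or architecture-gated test in a containerd/arm64 run is skipped rather than failed, which is why these entries count as SKIP instead of FAIL.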